Phase Segmentation in Atom-Probe Tomography Using Deep Learning-Based Edge Detection
Main Authors:
Format: article
Language: EN
Published: Nature Portfolio, 2019
Subjects:
Online Access: https://doaj.org/article/44bbc6b308d1471da03b8e1a1ab3c668
Summary: Atom-probe tomography (APT) facilitates nano- and atomic-scale characterization and analysis of microstructural features. Specifically, APT is well suited to studying the interfacial properties of granular or heterophase systems. Traditionally, the interface between, for example, precipitate and matrix phases in APT data has been identified either by extracting iso-concentration surfaces at a user-supplied concentration value or by manually perturbing that value until the iso-concentration surface qualitatively matches the interface. These approaches are subjective, do not scale, and may lead to inconsistencies due to local composition inhomogeneities. We introduce a digital image segmentation approach based on deep neural networks that transfer learned knowledge from natural images to automatically segment the data obtained from APT into different phases. This approach not only provides an efficient way to segment the data and extract interfacial properties but does so without the need for expensive interface labeling for training the segmentation model. We consider here a system with a precipitate phase in a matrix and with three different interface modalities (layered, isolated, and interconnected) that are obtained for different relative geometries of the precipitate phase. We demonstrate the accuracy of our segmentation approach through qualitative visualization of the interfaces, as well as through quantitative comparisons with proximity histograms obtained by using more traditional approaches.
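The traditional baseline the abstract refers to can be illustrated with a short sketch: voxelize the reconstructed atom positions into a solute-fraction grid, extract an iso-concentration surface at a user-supplied level, and bin compositions by distance to that surface as a crude proximity histogram. The toy data, the 1 nm voxel size, and the 0.5 threshold below are illustrative assumptions, not values taken from the paper.

```python
# Sketch (with assumed parameters): voxelization, iso-concentration surface,
# and a crude proximity histogram for a toy precipitate-in-matrix point cloud.
import numpy as np
from scipy.spatial import cKDTree
from skimage import measure

def concentration_grid(positions, is_solute, voxel_size=1.0):
    """Bin atom positions (N x 3, nm) into a voxel grid of solute fraction."""
    idx = np.floor((positions - positions.min(axis=0)) / voxel_size).astype(int)
    shape = tuple(idx.max(axis=0) + 1)
    total = np.zeros(shape)
    solute = np.zeros(shape)
    np.add.at(total, tuple(idx.T), 1.0)
    np.add.at(solute, tuple(idx.T), is_solute.astype(float))
    conc = np.divide(solute, total, out=np.zeros_like(total), where=total > 0)
    return conc, idx

# toy data set: a solute-rich spherical "precipitate" inside a dilute matrix
rng = np.random.default_rng(0)
pos = rng.uniform(0.0, 40.0, size=(200_000, 3))
in_precipitate = np.linalg.norm(pos - 20.0, axis=1) < 8.0
is_solute = rng.random(len(pos)) < np.where(in_precipitate, 0.8, 0.05)

conc, idx = concentration_grid(pos, is_solute, voxel_size=1.0)
level = 0.5                                   # user-supplied concentration value
verts, faces, _, _ = measure.marching_cubes(conc, level=level)
print(f"iso-concentration surface: {len(verts)} vertices, {len(faces)} faces")

# crude proximity histogram: solute fraction vs. (unsigned) distance to the surface
dist, _ = cKDTree(verts).query(idx.astype(float))
edges_nm = np.arange(0.0, 8.0, 1.0)
for lo, hi in zip(edges_nm[:-1], edges_nm[1:]):
    sel = (dist >= lo) & (dist < hi)
    if sel.any():
        print(f"{lo:3.0f}-{hi:3.0f} nm from interface: solute fraction {is_solute[sel].mean():.3f}")
```

Perturbing `level` and rerunning shows how strongly the extracted interface, and hence the proximity histogram, depends on the user-supplied concentration value, which is the sensitivity the authors set out to remove.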
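The segmentation idea itself, treating 2D slices of the composition grid as grayscale images and locating phase boundaries with an edge detector, can be sketched in the same spirit. A classical Canny filter stands in below for the pretrained, natural-image-trained deep network described in the abstract; the thresholds, slice size, and toy data are assumptions for illustration only.

```python
# Sketch of edge-based segmentation on a 2D composition slice (Canny used as a
# stand-in for a learned edge detector): slice -> edge map -> closed contour -> phase mask.
import numpy as np
from scipy import ndimage
from skimage import feature

def segment_slice(conc_slice, sigma=1.5):
    """Return a boolean precipitate mask for one 2D composition slice."""
    edges = feature.canny(conc_slice, sigma=sigma,
                          low_threshold=0.05, high_threshold=0.10)   # phase-boundary edge map
    closed = ndimage.binary_closing(edges, iterations=2)             # bridge small gaps in the contour
    return ndimage.binary_fill_holes(closed)                         # fill the enclosed region

# toy composition map: a solute-rich disc (~0.8) in a dilute matrix (~0.05), plus noise
yy, xx = np.mgrid[0:64, 0:64]
conc_slice = np.where((xx - 32) ** 2 + (yy - 32) ** 2 < 12 ** 2, 0.8, 0.05)
conc_slice = conc_slice + np.random.default_rng(1).normal(0.0, 0.03, conc_slice.shape)

mask = segment_slice(conc_slice)
print("segmented precipitate pixels:", int(mask.sum()))              # roughly the disc area
```

Because the boundary is found from local contrast rather than from an absolute concentration threshold, the mask does not require the user to supply or tune a concentration value, which is the practical advantage the abstract claims for the learned edge-detection approach.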