Attention-Guided deep atrous-residual U-Net architecture for automated gland segmentation in colon histopathology images

In digital pathology, gland segmentation plays a dominant role in the diagnosis and quantification of colon cancer. This paper therefore presents a clinically relevant, deep-learning-based automated gland segmentation technique, the Attention-Guided deep Atrous-Residual U-Net, which aims to capture small and intricate variations in medical images while preserving spatial information.


Saved in:
Bibliographic Details
Main authors: Manju Dabass, Sharda Vashisth, Rekha Vig
Format: article
Language: EN
Published: Elsevier 2021
Subjects:
Online access: https://doaj.org/article/ae08adc062c449289554a5e385e86c3e
id oai:doaj.org-article:ae08adc062c449289554a5e385e86c3e
record_format dspace
spelling oai:doaj.org-article:ae08adc062c449289554a5e385e86c3e 2021-11-12T04:42:15Z
Title: Attention-Guided deep atrous-residual U-Net architecture for automated gland segmentation in colon histopathology images
ISSN: 2352-9148
DOI: 10.1016/j.imu.2021.100784
URLs: https://doaj.org/article/ae08adc062c449289554a5e385e86c3e ; http://www.sciencedirect.com/science/article/pii/S2352914821002550 ; https://doaj.org/toc/2352-9148
Date: 2021-01-01T00:00:00Z
Abstract: (full text given in the description field below)
Authors: Manju Dabass; Sharda Vashisth; Rekha Vig
Publisher: Elsevier
Type: article
Subjects: Colon histopathology images; Automated gland segmentation; Deep learning; Residual learning; Attention mechanism; Multi-scale feature fusion; Computer applications to medicine. Medical informatics; R858-859.7
Language: EN
Source: Informatics in Medicine Unlocked, Vol 27, Iss , Pp 100784- (2021)
institution DOAJ
collection DOAJ
language EN
topic Colon histopathology images
Automated gland segmentation
Deep learning
Residual learning
Attention mechanism
Multi-scale feature fusion
Computer applications to medicine. Medical informatics
R858-859.7
spellingShingle Colon histopathology images
Automated gland segmentation
Deep learning
Residual learning
Attention mechanism
Multi-scale feature fusion
Computer applications to medicine. Medical informatics
R858-859.7
Manju Dabass
Sharda Vashisth
Rekha Vig
Attention-Guided deep atrous-residual U-Net architecture for automated gland segmentation in colon histopathology images
description In digital pathology, gland segmentation plays a dominant role in the diagnosis and quantification of colon cancer. This paper therefore presents a clinically relevant, deep-learning-based automated gland segmentation technique, the Attention-Guided deep Atrous-Residual U-Net, which aims to capture small and intricate variations in medical images while preserving spatial information. It is a modified U-Net architecture with enhanced learning capability: its Atrous-Residual units extract more discriminative, multi-level feature representations while alleviating the vanishing-gradient problem; its Attention units extract more gland-specific detailed features, refine the segmentation targets, and help concatenate semantically comparable features; and its Transitional Atrous units incorporate dense multi-scale features and counter resolution degradation. Various data augmentation and stain normalization techniques are also applied to improve generalization. Extensive experiments are performed on two publicly available datasets (the GlaS challenge and CRAG) and a private hospital dataset (HosC), which helps make the proposed architecture invariant to the digital variability present in clinical applications and verifies its robustness. The proposed model achieves competitive results against existing state-of-the-art techniques, with significant improvements in F1-score (at least 2% for GlaS and 3.7% for CRAG), Object-Dice Index (at least 2.3% for GlaS and 3.5% for CRAG), and Object-Hausdorff Distance (at least 2.89% for GlaS and 3.11% for CRAG). On the private HosC dataset, it achieves an F1-score of 0.947, an Object-Dice Index of 0.912, and an Object-Hausdorff Distance of 89.78.
In addition, the final outputs were validated by multiple pathologists, whose scores (0.9184 for GlaS test A, 0.91 for GlaS test B, 0.9032 for CRAG, and 0.904 for HosC) confirm the clinical relevance and suitability of the proposed model for gland detection and segmentation in clinical practice. It can further aid pathologists in making precise diagnoses and treatment plans.
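The record reports an Object-Dice Index but does not define it. As a rough illustration only, the following pure-Python sketch shows how a GlaS-style object-level Dice score is typically computed: each segmented object is matched to its maximally overlapping ground-truth object, per-object Dice values are weighted by object size, and the two matching directions are averaged. The set-of-pixel-coordinates representation and all function names are my own assumptions, not the paper's code.

```python
def dice(a, b):
    """Dice coefficient of two objects given as sets of pixel coordinates."""
    if not a and not b:
        return 1.0
    return 2.0 * len(a & b) / (len(a) + len(b))

def object_dice(segmented, ground_truth):
    """Object-level Dice (simplified GlaS-style definition):
    match each object on one side to its best-overlapping object on the
    other side, weight by object size, and average both directions."""
    def one_way(objs, others):
        total = sum(len(o) for o in objs)
        if total == 0:
            return 0.0
        return sum((len(o) / total) * max((dice(o, g) for g in others), default=0.0)
                   for o in objs)
    return 0.5 * (one_way(segmented, ground_truth) + one_way(ground_truth, segmented))

# Toy example: two predicted glands vs. two ground-truth glands
pred = [{(0, 0), (0, 1), (1, 0)}, {(5, 5), (5, 6)}]
gt   = [{(0, 0), (0, 1)},         {(5, 5), (5, 6)}]
score = object_dice(pred, gt)  # one extra predicted pixel lowers the score below 1
```

The size-weighting means errors on large glands penalize the score more than errors on small ones, which matches the intent of object-level evaluation in gland segmentation benchmarks.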
format article
author Manju Dabass
Sharda Vashisth
Rekha Vig
author_facet Manju Dabass
Sharda Vashisth
Rekha Vig
author_sort Manju Dabass
title Attention-Guided deep atrous-residual U-Net architecture for automated gland segmentation in colon histopathology images
title_short Attention-Guided deep atrous-residual U-Net architecture for automated gland segmentation in colon histopathology images
title_full Attention-Guided deep atrous-residual U-Net architecture for automated gland segmentation in colon histopathology images
title_fullStr Attention-Guided deep atrous-residual U-Net architecture for automated gland segmentation in colon histopathology images
title_full_unstemmed Attention-Guided deep atrous-residual U-Net architecture for automated gland segmentation in colon histopathology images
title_sort attention-guided deep atrous-residual u-net architecture for automated gland segmentation in colon histopathology images
publisher Elsevier
publishDate 2021
url https://doaj.org/article/ae08adc062c449289554a5e385e86c3e
work_keys_str_mv AT manjudabass attentionguideddeepatrousresidualunetarchitectureforautomatedglandsegmentationincolonhistopathologyimages
AT shardavashisth attentionguideddeepatrousresidualunetarchitectureforautomatedglandsegmentationincolonhistopathologyimages
AT rekhavig attentionguideddeepatrousresidualunetarchitectureforautomatedglandsegmentationincolonhistopathologyimages
_version_ 1718431251477561344