Deep Learning-Augmented Head and Neck Organs at Risk Segmentation From CT Volumes

Purpose: A novel deep learning model, the Siamese Ensemble Boundary Network (SEB-Net), was developed to improve the accuracy of automatic organs-at-risk (OARs) segmentation in head and neck (HaN) CT images, including small organs, and was verified for use in radiation oncology practice.

Full description

Saved in:
Bibliographic Details
Main Authors: Wei Wang, Qingxin Wang, Mengyu Jia, Zhongqiu Wang, Chengwen Yang, Daguang Zhang, Shujing Wen, Delong Hou, Ningbo Liu, Ping Wang, Jun Wang
Format: article
Language: EN
Published: Frontiers Media S.A. 2021
Subjects: radiotherapy, convolutional neural networks, automatic segmentation, head and neck cancer, deep learning, Physics
Online Access: https://doaj.org/article/945aae3dad994147b8c34a2fa781bd23
id oai:doaj.org-article:945aae3dad994147b8c34a2fa781bd23
record_format dspace
issn 2296-424X
doi 10.3389/fphy.2021.743190
url https://www.frontiersin.org/articles/10.3389/fphy.2021.743190/full
url https://doaj.org/toc/2296-424X
source Frontiers in Physics, Vol 9 (2021)
institution DOAJ
collection DOAJ
language EN
topic radiotherapy
convolutional neural networks
automatic segmentation
head and neck cancer
deep learning
Physics
QC1-999
description Purpose: A novel deep learning model, the Siamese Ensemble Boundary Network (SEB-Net), was developed to improve the accuracy of automatic organs-at-risk (OARs) segmentation in head and neck (HaN) CT images, including small organs, and was verified for use in radiation oncology practice.

Methods: SEB-Net was designed to transform CT slices into probability maps for HaN OARs segmentation. Two key contributions were made to the network design to improve the accuracy and reliability of automatic segmentation of challenging organs (e.g., relatively tiny or irregularly shaped ones) without sacrificing the field of view. The first is an ensemble learning strategy with shared weights that aggregates pixel-probability predictions from the three orthogonal CT planes to improve 3D information integrity; the second is a boundary loss, formulated as a distance metric on the space of contours, that mitigates the difficulties conventional region-based losses face in highly unbalanced segmentation scenarios. By combining the two techniques, segmentation is enhanced by exploiting both inter- and intra-slice information. In total, 188 patients with HaN cancer were included in the study, of which 133 were randomly selected for training and 55 for validation. An additional 50 untreated cases were used for clinical evaluation.

Results: With the proposed method, the average volumetric Dice similarity coefficient (DSC) of HaN OARs (and small organs) was 0.871 (0.900), significantly higher than the results of Ua-Net, Anatomy-Net, and SRM by 4.94% (26.05%), 7.80% (24.65%), and 12.97% (40.19%), respectively. The average 95% Hausdorff distance (95% HD) of HaN OARs (and small organs) was 2.87 mm (0.81 mm), improving on the other three methods by 50.94% (75.45%), 88.41% (79.07%), and 5.59% (67.98%), respectively. After delineation by SEB-Net, 81.92% of all organs in the 50 untreated HaN cancer cases did not require modification in clinical evaluation.

Conclusions: Compared with several state-of-the-art methods, including Ua-Net, Anatomy-Net, and SRM, the proposed method substantially improves segmentation accuracy for HaN and small organs from CT imaging in terms of efficiency, feasibility, and applicability.
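The tri-planar ensemble described in the Methods can be illustrated with a minimal sketch: one 2D segmentation network with shared weights is applied to the axial, coronal, and sagittal slice stacks of the volume, and the per-voxel class probabilities are averaged. The PyTorch sketch below shows that idea only; the function name, tensor layout, and the placeholder model are illustrative assumptions rather than the authors' SEB-Net implementation, and a fully convolutional 2D model that accepts arbitrary slice sizes is assumed.

import torch

def triplanar_probabilities(model: torch.nn.Module, volume: torch.Tensor) -> torch.Tensor:
    # volume: (D, H, W) CT volume; returns (C, D, H, W) class probabilities
    # averaged over predictions made on axial, coronal, and sagittal slices.
    probs = []
    for axis in (0, 1, 2):                                 # axial, coronal, sagittal
        slices = volume.movedim(axis, 0).unsqueeze(1)      # (N, 1, h, w) slice stack
        with torch.no_grad():
            logits = model(slices)                         # assumed output: (N, C, h, w)
        p = torch.softmax(logits, dim=1)
        probs.append(p.movedim(0, 1).movedim(1, axis + 1)) # restore (C, D, H, W) layout
    return torch.stack(probs).mean(dim=0)                  # ensemble average over planes

Averaging the three planar predictions is one simple fusion rule; the shared weights are what make the three branches a Siamese ensemble rather than three independent models.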
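The boundary loss is described above only as a distance metric on the space of contours. A common way to realize such a loss (following Kervadec et al., 2019) is to weight the predicted probabilities by a signed distance map of the ground-truth boundary; the sketch below assumes that formulation, which may differ in detail from the authors' implementation, and all names are illustrative.

import numpy as np
import torch
from scipy.ndimage import distance_transform_edt

def signed_distance_map(gt_mask: np.ndarray) -> np.ndarray:
    # Signed distance to the ground-truth boundary: negative inside the organ,
    # positive outside, approximately zero on the contour.
    posmask = gt_mask.astype(bool)
    if not posmask.any():
        return np.zeros_like(gt_mask, dtype=np.float32)
    dist_out = distance_transform_edt(~posmask)  # distance of background voxels to the organ
    dist_in = distance_transform_edt(posmask)    # distance of organ voxels to the background
    return (dist_out - dist_in).astype(np.float32)

def boundary_loss(prob: torch.Tensor, sdm: torch.Tensor) -> torch.Tensor:
    # prob: softmax probability of the organ class; sdm: precomputed signed distance
    # map of the ground truth (same shape). Probability mass placed far outside the
    # true contour is penalized; mass placed inside it is rewarded.
    return (prob * sdm).mean()

In practice a boundary term like this is usually combined with a regional loss (e.g., Dice or cross-entropy) through a weighting schedule, since on its own it provides weak gradients early in training.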
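For reference, the two reported metrics, volumetric DSC and the 95% Hausdorff distance, can be computed as sketched below with NumPy/SciPy. The exact surface extraction and voxel-spacing handling used in the paper's evaluation are not specified, so this is an assumed standard implementation.

import numpy as np
from scipy.ndimage import binary_erosion, distance_transform_edt

def dice_coefficient(pred: np.ndarray, gt: np.ndarray) -> float:
    # Volumetric Dice similarity coefficient between two binary masks.
    pred, gt = pred.astype(bool), gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    denom = pred.sum() + gt.sum()
    return float(2.0 * inter / denom) if denom > 0 else 1.0

def hd95(pred: np.ndarray, gt: np.ndarray, spacing=(1.0, 1.0, 1.0)) -> float:
    # 95th percentile of symmetric surface-to-surface distances, in mm given the voxel spacing.
    def surface(mask):
        mask = mask.astype(bool)
        return mask & ~binary_erosion(mask)
    sp, sg = surface(pred), surface(gt)
    if not sp.any() or not sg.any():
        return float("nan")
    # Distance from each surface voxel of one mask to the nearest surface voxel of the other.
    d_pred_to_gt = distance_transform_edt(~sg, sampling=spacing)[sp]
    d_gt_to_pred = distance_transform_edt(~sp, sampling=spacing)[sg]
    return float(np.percentile(np.hstack([d_pred_to_gt, d_gt_to_pred]), 95))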
format article
author Wei Wang
Qingxin Wang
Mengyu Jia
Zhongqiu Wang
Chengwen Yang
Daguang Zhang
Shujing Wen
Delong Hou
Ningbo Liu
Ping Wang
Jun Wang
author_sort Wei Wang
title Deep Learning-Augmented Head and Neck Organs at Risk Segmentation From CT Volumes
publisher Frontiers Media S.A.
publishDate 2021
url https://doaj.org/article/945aae3dad994147b8c34a2fa781bd23