Multimodal fusion with deep neural networks for leveraging CT imaging and electronic health record: a case-study in pulmonary embolism detection
Abstract: Recent advancements in deep learning have led to a resurgence of medical imaging and Electronic Medical Record (EMR) models for a variety of applications, including clinical decision support, automated workflow triage, clinical prediction, and more. However, very few models have been developed to integrate both clinical and imaging data, despite the fact that in routine practice clinicians rely on the EMR to provide context for medical imaging interpretation. In this study, we developed and compared different multimodal fusion model architectures capable of utilizing both pixel data from volumetric Computed Tomography Pulmonary Angiography (CTPA) scans and clinical patient data from the EMR to automatically classify Pulmonary Embolism (PE) cases. The best-performing multimodal model is a late fusion model that achieves an AUROC of 0.947 [95% CI: 0.946–0.948] on the entire held-out test set, outperforming the imaging-only and EMR-only single-modality models.
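The best model described in the abstract is a late fusion design: each modality gets its own model, and their outputs are combined by a small downstream classifier rather than merging raw features. As a minimal, hypothetical sketch of that general pattern (the class name, submodules, layer sizes, and input shapes below are illustrative placeholders, not the authors' published implementation), one could write:

```python
# Hypothetical late-fusion sketch in PyTorch. This is NOT the paper's architecture:
# the submodules, layer sizes, and input shapes are placeholders that only
# illustrate the general late-fusion pattern the abstract refers to.
import torch
import torch.nn as nn


class LateFusionPE(nn.Module):
    """Combine per-modality PE predictions with a small meta-classifier."""

    def __init__(self, imaging_model: nn.Module, emr_model: nn.Module):
        super().__init__()
        self.imaging_model = imaging_model  # e.g. a CNN over CTPA voxel data -> 1 logit
        self.emr_model = emr_model          # e.g. an MLP over tabular EMR features -> 1 logit
        # Meta-classifier over the two single-modality probabilities.
        self.fusion_head = nn.Sequential(nn.Linear(2, 8), nn.ReLU(), nn.Linear(8, 1))

    def forward(self, ct_volume: torch.Tensor, emr_features: torch.Tensor) -> torch.Tensor:
        p_img = torch.sigmoid(self.imaging_model(ct_volume))   # (batch, 1)
        p_emr = torch.sigmoid(self.emr_model(emr_features))    # (batch, 1)
        fused = torch.cat([p_img, p_emr], dim=1)                # (batch, 2)
        return torch.sigmoid(self.fusion_head(fused))           # fused PE probability


# Tiny stand-ins so the sketch runs end to end; real single-modality models
# would be far larger and trained on CTPA volumes and EMR data respectively.
imaging_stub = nn.Sequential(nn.Flatten(), nn.LazyLinear(1))
emr_stub = nn.Linear(10, 1)
model = LateFusionPE(imaging_stub, emr_stub)
probs = model(torch.randn(4, 1, 8, 32, 32), torch.randn(4, 10))  # -> shape (4, 1)
```

A common reason to prefer late fusion over fusing raw features is that each branch can be developed and validated independently; only the small fusion head needs paired multimodal data.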
Main Authors: | Shih-Cheng Huang, Anuj Pareek, Roham Zamanian, Imon Banerjee, Matthew P. Lungren |
---|---|
Format: | article |
Language: | EN |
Published: | Nature Portfolio, 2020 |
Subjects: | Medicine (R); Science (Q) |
Online Access: | https://doaj.org/article/ba490b3f075f4d0aa5669e0e51d909ba |
id | oai:doaj.org-article:ba490b3f075f4d0aa5669e0e51d909ba |
---|---|
record_format | dspace |
spelling | Multimodal fusion with deep neural networks for leveraging CT imaging and electronic health record: a case-study in pulmonary embolism detection. DOI: 10.1038/s41598-020-78888-w. ISSN: 2045-2322. Published 2020-12-01 by Nature Portfolio in Scientific Reports, Vol 10, Iss 1, Pp 1-9 (2020). Authors: Shih-Cheng Huang, Anuj Pareek, Roham Zamanian, Imon Banerjee, Matthew P. Lungren. Topics: Medicine (R), Science (Q). Language: EN. Full text: https://doi.org/10.1038/s41598-020-78888-w. Journal TOC: https://doaj.org/toc/2045-2322. DOAJ record: https://doaj.org/article/ba490b3f075f4d0aa5669e0e51d909ba |
institution | DOAJ |
collection | DOAJ |
language | EN |
topic | Medicine (R); Science (Q) |
description | Abstract: Recent advancements in deep learning have led to a resurgence of medical imaging and Electronic Medical Record (EMR) models for a variety of applications, including clinical decision support, automated workflow triage, clinical prediction, and more. However, very few models have been developed to integrate both clinical and imaging data, despite the fact that in routine practice clinicians rely on the EMR to provide context for medical imaging interpretation. In this study, we developed and compared different multimodal fusion model architectures capable of utilizing both pixel data from volumetric Computed Tomography Pulmonary Angiography (CTPA) scans and clinical patient data from the EMR to automatically classify Pulmonary Embolism (PE) cases. The best-performing multimodal model is a late fusion model that achieves an AUROC of 0.947 [95% CI: 0.946–0.948] on the entire held-out test set, outperforming the imaging-only and EMR-only single-modality models. |
format | article |
author | Shih-Cheng Huang; Anuj Pareek; Roham Zamanian; Imon Banerjee; Matthew P. Lungren |
author_sort | Shih-Cheng Huang |
title | Multimodal fusion with deep neural networks for leveraging CT imaging and electronic health record: a case-study in pulmonary embolism detection |
publisher | Nature Portfolio |
publishDate | 2020 |
url | https://doaj.org/article/ba490b3f075f4d0aa5669e0e51d909ba |