A deep transfer learning approach for wearable sleep stage classification with photoplethysmography

Abstract: Unobtrusive home sleep monitoring using wrist-worn wearable photoplethysmography (PPG) could open the way for better sleep disorder screening and health monitoring. However, PPG is rarely included in large sleep studies with gold-standard sleep annotation from polysomnography. Therefore, training data-intensive state-of-the-art deep neural networks is challenging. In this work, a deep recurrent neural network is first trained using a large sleep data set with electrocardiogram (ECG) data (292 participants, 584 recordings) to perform 4-class sleep stage classification (wake, rapid-eye-movement, N1/N2, and N3). A small part of its weights is adapted to a smaller, newer PPG data set (60 healthy participants, 101 recordings) through three variations of transfer learning. Best results (Cohen's kappa of 0.65 ± 0.11, accuracy of 76.36 ± 7.57%) were achieved with the domain and decision combined transfer learning strategy, significantly outperforming the PPG-trained and ECG-trained baselines. This performance for PPG-based 4-class sleep stage classification is unprecedented in the literature, bringing home sleep stage monitoring closer to clinical use. The work demonstrates the merit of transfer learning in developing reliable methods for new sensor technologies by reusing similar, older non-wearable data sets. Further study should evaluate our approach in patients with sleep disorders such as insomnia and sleep apnoea.
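
To make the transfer-learning idea concrete, the following is a minimal, hypothetical sketch of the "domain and decision combined" strategy described in the abstract: a recurrent sleep stager is pretrained on ECG-derived features, its shared recurrent core is then frozen, and only the input-facing (domain) and output-facing (decision) layers are retrained on the smaller PPG data set. This is not the authors' published code; the framework (PyTorch), layer sizes, feature dimensions, and checkpoint name are illustrative assumptions.

```python
# Illustrative sketch (not the authors' code): adapt an ECG-pretrained
# recurrent sleep stager to PPG by retraining only the "domain" (input)
# and "decision" (output) layers, keeping the shared recurrent core frozen.
import torch
import torch.nn as nn

NUM_FEATURES = 32   # assumed number of heartbeat-derived features per 30-s epoch
NUM_CLASSES = 4     # wake, REM, N1/N2, N3

class SleepStager(nn.Module):
    def __init__(self):
        super().__init__()
        self.domain = nn.Linear(NUM_FEATURES, 64)                 # modality-specific input mapping
        self.core = nn.GRU(64, 64, num_layers=2,
                           batch_first=True, bidirectional=True)  # shared sequence model
        self.decision = nn.Linear(128, NUM_CLASSES)               # per-epoch classification head

    def forward(self, x):                 # x: (batch, epochs, features)
        h = torch.relu(self.domain(x))
        h, _ = self.core(h)
        return self.decision(h)           # logits: (batch, epochs, classes)

# 1) Start from a model pretrained on the large ECG data set.
model = SleepStager()
# model.load_state_dict(torch.load("ecg_pretrained.pt"))  # hypothetical checkpoint

# 2) Freeze the shared recurrent core; leave the domain and decision layers
#    trainable ("domain and decision combined" transfer strategy).
for p in model.core.parameters():
    p.requires_grad = False

optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4)
criterion = nn.CrossEntropyLoss()

# 3) Fine-tune on the (much smaller) PPG data set.
def finetune_step(ppg_features, stage_labels):
    """One optimisation step on a batch of PPG recordings."""
    optimizer.zero_grad()
    logits = model(ppg_features)                       # (B, T, 4)
    loss = criterion(logits.reshape(-1, NUM_CLASSES),
                     stage_labels.reshape(-1))
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this sketch, only the small domain and decision layers receive gradient updates, so the limited PPG data is used to adapt the modality-specific mappings while the sequence dynamics learned from the larger ECG corpus are preserved.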

Bibliographic Details
Main Authors: Mustafa Radha, Pedro Fonseca, Arnaud Moreau, Marco Ross, Andreas Cerny, Peter Anderer, Xi Long, Ronald M. Aarts
Format: article
Language: EN
Published: Nature Portfolio, 2021
Subjects: Computer applications to medicine. Medical informatics (R858-859.7)
Online Access: https://doaj.org/article/fec60914d8ef48c2a357f505610bfa45
DOI: 10.1038/s41746-021-00510-8
ISSN: 2398-6352
Published in: npj Digital Medicine, Vol 4, Iss 1, Pp 1-11 (2021)