Real-time synthesis of imagined speech processes from minimally invasive recordings of neural activity

Miguel Angrick et al. develop an intracranial EEG-based method to decode imagined speech from a human patient and translate it into audible speech in real-time. This report presents an important proof of concept that acoustic output can be reconstructed on the basis of neural signals, and serves as a valuable step in the development of neuroprostheses to help nonverbal patients interact with their environment.

Full description

Saved in:
Bibliographic Details
Main Authors: Miguel Angrick, Maarten C. Ottenhoff, Lorenz Diener, Darius Ivucic, Gabriel Ivucic, Sophocles Goulis, Jeremy Saal, Albert J. Colon, Louis Wagner, Dean J. Krusienski, Pieter L. Kubben, Tanja Schultz, Christian Herff
Format: article
Language: EN
Published: Nature Portfolio 2021
Subjects:
Online Access: https://doaj.org/article/6889a47970bf4c32a7cfd367930fc291
id oai:doaj.org-article:6889a47970bf4c32a7cfd367930fc291
record_format dspace
spelling oai:doaj.org-article:6889a47970bf4c32a7cfd367930fc291 2021-12-02T15:15:13Z
title Real-time synthesis of imagined speech processes from minimally invasive recordings of neural activity
doi 10.1038/s42003-021-02578-0
issn 2399-3642
publishDate 2021-09-01
url https://doaj.org/article/6889a47970bf4c32a7cfd367930fc291
url https://doi.org/10.1038/s42003-021-02578-0
url https://doaj.org/toc/2399-3642
description Miguel Angrick et al. develop an intracranial EEG-based method to decode imagined speech from a human patient and translate it into audible speech in real-time. This report presents an important proof of concept that acoustic output can be reconstructed on the basis of neural signals, and serves as a valuable step in the development of neuroprostheses to help nonverbal patients interact with their environment.
author Miguel Angrick; Maarten C. Ottenhoff; Lorenz Diener; Darius Ivucic; Gabriel Ivucic; Sophocles Goulis; Jeremy Saal; Albert J. Colon; Louis Wagner; Dean J. Krusienski; Pieter L. Kubben; Tanja Schultz; Christian Herff
publisher Nature Portfolio
topic Biology (General); QH301-705.5
language EN
citation Communications Biology, Vol 4, Iss 1, Pp 1-10 (2021)
institution DOAJ
collection DOAJ
language EN
topic Biology (General)
QH301-705.5
description Miguel Angrick et al. develop an intracranial EEG-based method to decode imagined speech from a human patient and translate it into audible speech in real-time. This report presents an important proof of concept that acoustic output can be reconstructed on the basis of neural signals, and serves as a valuable step in the development of neuroprostheses to help nonverbal patients interact with their environment.
format article
author Miguel Angrick
Maarten C. Ottenhoff
Lorenz Diener
Darius Ivucic
Gabriel Ivucic
Sophocles Goulis
Jeremy Saal
Albert J. Colon
Louis Wagner
Dean J. Krusienski
Pieter L. Kubben
Tanja Schultz
Christian Herff
title Real-time synthesis of imagined speech processes from minimally invasive recordings of neural activity
publisher Nature Portfolio
publishDate 2021
url https://doaj.org/article/6889a47970bf4c32a7cfd367930fc291