See, Hear, or Feel – to Speak: A Versatile Multiple-Choice Functional Near-Infrared Spectroscopy-Brain-Computer Interface Feasible With Visual, Auditory, or Tactile Instructions
Main Authors: Laurien Nagels-Coune, Lars Riecke, Amaia Benitez-Andonegui, Simona Klinkhammer, Rainer Goebel, Peter De Weerd, Michael Lührs, Bettina Sorger
Format: article
Language: EN
Published: Frontiers Media S.A., 2021
Subjects: functional near-infrared spectroscopy (fNIRS); brain-computer interface (BCI); motor imagery (MI); mental drawing; sensory encoding modality; four-choice communication
Online Access: https://doaj.org/article/065239256e3040dbab5cf004d4b85dfc
id
oai:doaj.org-article:065239256e3040dbab5cf004d4b85dfc
record_format
dspace
spelling
Record: oai:doaj.org-article:065239256e3040dbab5cf004d4b85dfc (2021-12-01T02:42:38Z)
Title: See, Hear, or Feel – to Speak: A Versatile Multiple-Choice Functional Near-Infrared Spectroscopy-Brain-Computer Interface Feasible With Visual, Auditory, or Tactile Instructions
ISSN: 1662-5161
DOI: 10.3389/fnhum.2021.784522
Published online: 2021-11-01
Full text: https://www.frontiersin.org/articles/10.3389/fnhum.2021.784522/full
DOAJ record: https://doaj.org/article/065239256e3040dbab5cf004d4b85dfc
Journal TOC: https://doaj.org/toc/1662-5161
Authors: Laurien Nagels-Coune, Lars Riecke, Amaia Benitez-Andonegui, Simona Klinkhammer, Rainer Goebel, Peter De Weerd, Michael Lührs, Bettina Sorger
Publisher: Frontiers Media S.A.
Keywords: functional near-infrared spectroscopy (fNIRS); brain-computer interface (BCI); motor imagery (MI); mental drawing; sensory encoding modality; four-choice communication
Classification: Neurosciences. Biological psychiatry. Neuropsychiatry (RC321-571)
Language: EN
Source: Frontiers in Human Neuroscience, Vol 15 (2021)
institution
DOAJ
collection
DOAJ
language
EN
topic
functional near-infrared spectroscopy (fNIRS); brain-computer interface (BCI); motor imagery (MI); mental drawing; sensory encoding modality; four-choice communication; Neurosciences. Biological psychiatry. Neuropsychiatry; RC321-571
description
Severely motor-disabled patients, such as those suffering from the so-called “locked-in” syndrome, cannot communicate naturally. They may benefit from brain-computer interfaces (BCIs) exploiting brain signals for communication and therewith circumventing the muscular system. One BCI technique that has gained attention recently is functional near-infrared spectroscopy (fNIRS). Typically, fNIRS-based BCIs allow for brain-based communication via voluntary modulation of brain activity through mental task performance guided by visual or auditory instructions. While the development of fNIRS-BCIs has made great progress, the reliability of fNIRS-BCIs across time and environments has rarely been assessed. In the present fNIRS-BCI study, we tested six healthy participants across three consecutive days using a straightforward four-choice fNIRS-BCI communication paradigm that allows answer encoding based on instructions using various sensory modalities. To encode an answer, participants performed a motor imagery task (mental drawing) in one out of four time periods. Answer encoding was guided by either the visual, auditory, or tactile sensory modality. Two participants were tested outside the laboratory in a cafeteria. Answers were decoded from the time course of the most-informative fNIRS channel-by-chromophore combination. Across the three testing days, we obtained mean single- and multi-trial (joint analysis of four consecutive trials) accuracies of 62.5 and 85.19%, respectively. Obtained multi-trial accuracies were 86.11% for visual, 80.56% for auditory, and 88.89% for tactile sensory encoding. The two participants who used the fNIRS-BCI in a cafeteria obtained the best single- (72.22 and 77.78%) and multi-trial accuracies (100 and 94.44%). Communication was reliable over the three recording sessions, with multi-trial accuracies of 86.11% on day 1, 86.11% on day 2, and 83.33% on day 3. To gauge the trade-off between the number of optodes and decoding accuracy, averaging across two and three promising fNIRS channels was compared to the one-channel approach. Multi-trial accuracy increased from 85.19% (one-channel approach) to 91.67% (two-/three-channel approach). In sum, the presented fNIRS-BCI yielded robust decoding results using three alternative sensory encoding modalities. Further, fNIRS-BCI communication was stable over the course of three consecutive days, even in a natural (social) environment. Therewith, the developed fNIRS-BCI demonstrated high flexibility, reliability, and robustness, crucial requirements for future clinical applicability.
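The decoding scheme summarized in the abstract can be illustrated with a minimal Python sketch. This is not the study's actual analysis pipeline: the 10 Hz sampling rate, the 10-second encoding windows, the use of a simple per-window mean of the HbO signal as the decision score, the neglect of the hemodynamic delay, and the function names are all assumptions made here for brevity. The sketch only shows the core idea, namely that the answer is read out as the encoding window with the strongest response in the selected channel, and that multi-trial decoding averages several trials encoding the same answer before decoding.

# Minimal sketch of four-choice time-window decoding from a single fNIRS (HbO) channel.
# Illustrative assumptions: 10 Hz sampling, 10-s windows, mean amplitude as score,
# hemodynamic delay ignored.
import numpy as np

FS = 10.0        # sampling rate in Hz (assumed)
WINDOW_S = 10.0  # duration of each of the four encoding windows in seconds (assumed)

def decode_single_trial(hbo, fs=FS, window_s=WINDOW_S):
    """Score each of the four encoding windows by its mean HbO level and
    return the index (0-3) of the window with the largest response."""
    n_win = int(window_s * fs)
    scores = [hbo[i * n_win:(i + 1) * n_win].mean() for i in range(4)]
    return int(np.argmax(scores))

def decode_multi_trial(trials, fs=FS, window_s=WINDOW_S):
    """Joint analysis of several trials encoding the same answer:
    average their time courses first, then decode the averaged signal."""
    return decode_single_trial(np.mean(trials, axis=0), fs, window_s)

if __name__ == "__main__":
    # Simulate four noisy trials in which the (hypothetical) participant
    # performed mental drawing during encoding window 2.
    rng = np.random.default_rng(0)
    n = int(4 * WINDOW_S * FS)
    template = np.zeros(n)
    template[int(2 * WINDOW_S * FS):int(3 * WINDOW_S * FS)] = 1.0  # activation in window 2
    trials = [template + 0.8 * rng.standard_normal(n) for _ in range(4)]
    print("single-trial decisions:", [decode_single_trial(tr) for tr in trials])
    print("multi-trial decision  :", decode_multi_trial(trials))

Running the demo prints window index 2 for each simulated trial and for the multi-trial average, mirroring how averaging consecutive trials can stabilize the decision relative to single-trial decoding.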
format
article
author
Laurien Nagels-Coune, Lars Riecke, Amaia Benitez-Andonegui, Simona Klinkhammer, Rainer Goebel, Peter De Weerd, Michael Lührs, Bettina Sorger
author_sort
Laurien Nagels-Coune
title
See, Hear, or Feel – to Speak: A Versatile Multiple-Choice Functional Near-Infrared Spectroscopy-Brain-Computer Interface Feasible With Visual, Auditory, or Tactile Instructions
publisher
Frontiers Media S.A.
publishDate
2021
url
https://doaj.org/article/065239256e3040dbab5cf004d4b85dfc