Myoelectric digit action decoding with multi-output, multi-class classification: an offline analysis

Abstract The ultimate goal of machine learning-based myoelectric control is simultaneous and independent control of multiple degrees of freedom (DOFs), including wrist and digit artificial joints. For prosthetic finger control, regression-based methods are typically used to reconstruct position/velocity trajectories from surface electromyogram (EMG) signals. Unfortunately, such methods have thus far met with limited success. In this work, we propose action decoding, a paradigm-shifting approach for independent, multi-digit movement intent prediction based on multi-output, multi-class classification. At each moment in time, our algorithm decodes movement intent for each available DOF into one of three classes: open, close, or stall (i.e., no movement). Despite using a classifier as the decoder, arbitrary hand postures are possible with our approach. We analyse a public dataset previously recorded and published by us, comprising measurements from 10 able-bodied and two transradial amputee participants. We demonstrate the feasibility of using our proposed action decoding paradigm to predict movement action for all five digits as well as rotation of the thumb. We perform a systematic offline analysis by investigating the effect of various algorithmic parameters on decoding performance, such as feature selection and choice of classification algorithm and multi-output strategy. The outcomes of the offline analysis presented in this study will be used to inform the real-time implementation of our algorithm. In the future, we will further evaluate its efficacy with real-time control experiments involving upper-limb amputees.
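The abstract describes the decoding scheme at a conceptual level: per time window, one three-class decision (open, close, or stall) is made independently for each of the six DOFs (five digits plus thumb rotation). The following is a minimal sketch of that multi-output, multi-class setup, not the authors' implementation; it assumes scikit-learn, uses a placeholder EMG feature matrix, and picks linear discriminant analysis and a one-classifier-per-DOF strategy purely for illustration.

```python
# Minimal sketch (not the authors' implementation) of multi-output, multi-class
# digit action decoding from windowed EMG features, assuming scikit-learn.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.multioutput import MultiOutputClassifier

rng = np.random.default_rng(0)

N_WINDOWS, N_FEATURES, N_DOFS = 1000, 64, 6     # 6 DOFs: five digits + thumb rotation
CLASSES = {0: "stall", 1: "open", 2: "close"}   # three action classes per DOF

# Placeholder data: in practice X holds EMG features (e.g., time-domain features
# per channel and window) and y holds one action label per DOF per window.
X = rng.standard_normal((N_WINDOWS, N_FEATURES))
y = rng.integers(0, 3, size=(N_WINDOWS, N_DOFS))

# One independent three-class classifier per DOF (a simple multi-output strategy).
decoder = MultiOutputClassifier(LinearDiscriminantAnalysis())
decoder.fit(X[:800], y[:800])

# At each new window, an action is predicted for every DOF simultaneously.
pred = decoder.predict(X[800:])                 # shape: (200, N_DOFS)
print([CLASSES[c] for c in pred[0]])            # e.g. ['close', 'stall', ...]
```

The choice of feature set, base classifier, and multi-output strategy are exactly the algorithmic parameters the paper's offline analysis compares; the sketch fixes one arbitrary combination only to make the data shapes concrete.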


Saved in:
Bibliographic Details
Main Authors: Agamemnon Krasoulis, Kianoush Nazarpour
Format: article
Language: EN
Published: Nature Portfolio, 1 October 2020 (Scientific Reports, Vol 10, Iss 1, Pp 1-10)
Subjects: Medicine (R), Science (Q)
DOI: https://doi.org/10.1038/s41598-020-72574-7
ISSN: 2045-2322
Online Access: https://doaj.org/article/ae3e769ae891484ca5e2ce83d4b4cd4c