A deep-learned skin sensor decoding the epicentral human motions
Real-time monitoring of human motion normally demands connecting a large number of sensors in a complicated network. To simplify this, Kim et al. decode finger motions with a flexible sensor attached to the wrist that measures skin deformation, aided by a deep-learning architecture.
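The summary describes the decoding pipeline only at a high level: a wrist-worn flexible sensor reads skin deformation, and a deep network maps that signal to finger motion. As a rough illustration of that idea, and not the authors' published architecture, the following PyTorch sketch regresses per-finger joint angles from a windowed single-channel strain signal; the window length, layer sizes, and five-finger output are all illustrative assumptions.

```python
# Illustrative sketch only: maps a windowed skin-strain signal to finger joint
# angles with a small 1D-CNN regressor. Shapes and sizes are assumptions, not
# taken from the paper.
import torch
import torch.nn as nn

class SkinStrainDecoder(nn.Module):
    def __init__(self, n_fingers: int = 5):
        super().__init__()
        # 1D convolutions extract local temporal features from the raw strain signal
        self.encoder = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(16),
        )
        # Fully connected head regresses one angle per finger from pooled features
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16, 64),
            nn.ReLU(),
            nn.Linear(64, n_fingers),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, window) raw sensor samples -> (batch, n_fingers) angles
        return self.head(self.encoder(x))

if __name__ == "__main__":
    model = SkinStrainDecoder()
    dummy = torch.randn(4, 1, 128)   # a batch of 4 hypothetical strain windows
    print(model(dummy).shape)        # torch.Size([4, 5])
```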
Main Authors: Kyun Kyu Kim, InHo Ha, Min Kim, Joonhwa Choi, Phillip Won, Sungho Jo, Seung Hwan Ko
Format: Article
Language: English
Published: Nature Portfolio, 2020
Online Access: https://doaj.org/article/797f526273074c249f4dc388256638d3
Similar Items
- Biomimetic chameleon soft robot with artificial crypsis and disruptive coloration skin
  by: Hyeonseok Kim, et al.
  Published: (2021)
- Transparent wearable three-dimensional touch by self-generated multiscale structure
  by: Kyun Kyu Kim, et al.
  Published: (2019)
- Research in complex humanitarian emergencies: the Médecins Sans Frontières/Epicentre experience.
  by: Vincent Brown, et al.
  Published: (2008)
- Climate-induced elevational range shifts and increase in plant species richness in a Himalayan biodiversity epicentre.
  by: Yasmeen Telwala, et al.
  Published: (2013)
- Dyskinesia estimation during activities of daily living using wearable motion sensors and deep recurrent networks
  by: Murtadha D. Hssayeni, et al.
  Published: (2021)