Emotion Recognition Using a Glasses-Type Wearable Device via Multi-Channel Facial Responses
We present a glasses-type wearable device to detect emotions from a human face in an unobtrusive manner. The device is designed to gather multi-channel responses from the user’s face naturally and continuously while it is worn. The multi-channel facial responses consist of local facial images and biosignals including electrodermal activity (EDA) and photoplethysmogram (PPG).
Saved in:

Main Authors: | Jangho Kwon, Jihyeon Ha, Da-Hye Kim, Jun Won Choi, Laehyun Kim
---|---
Format: | article
Language: | EN
Published: | IEEE, 2021
Subjects: | Wearable device; emotion recognition; affective computing; facial expression; biosignal; physiological responses
Online Access: | https://doaj.org/article/f1d1495afbd147e6b3182ed1a3df8859
id | oai:doaj.org-article:f1d1495afbd147e6b3182ed1a3df8859
---|---
record_format | dspace
spelling | oai:doaj.org-article:f1d1495afbd147e6b3182ed1a3df8859 (2021-11-09T00:03:15Z). ISSN: 2169-3536. DOI: 10.1109/ACCESS.2021.3121543. Published 2021-01-01 by IEEE in IEEE Access, Vol 9, Pp 146392-146403 (2021). Full text: https://ieeexplore.ieee.org/document/9580894/ ; journal TOC: https://doaj.org/toc/2169-3536 ; DOAJ record: https://doaj.org/article/f1d1495afbd147e6b3182ed1a3df8859. Authors: Jangho Kwon, Jihyeon Ha, Da-Hye Kim, Jun Won Choi, Laehyun Kim. Keywords: wearable device; emotion recognition; affective computing; facial expression; biosignal; physiological responses; Electrical engineering. Electronics. Nuclear engineering (TK1-9971). Language: EN. Abstract as given in the description field below.
institution | DOAJ
collection | DOAJ
language | EN
topic | Wearable device; emotion recognition; affective computing; facial expression; biosignal; physiological responses; Electrical engineering. Electronics. Nuclear engineering (TK1-9971)
description | We present a glasses-type wearable device to detect emotions from a human face in an unobtrusive manner. The device is designed to gather multi-channel responses from the user’s face naturally and continuously while it is worn. The multi-channel facial responses consist of local facial images and biosignals including electrodermal activity (EDA) and photoplethysmogram (PPG). We conducted experiments to determine the optimal positions of the EDA sensors on the wearable device because EDA signal quality is highly sensitive to the sensing position. In addition to the physiological data, the device captures the image region representing local facial expressions around the left eye via a built-in camera. In this study, we developed and validated an algorithm to recognize emotions using the multi-channel responses obtained from the device. The results show that the emotion recognition algorithm using only local facial images achieves an accuracy of 76.09% in classifying emotions. With multi-channel data including EDA and PPG, this accuracy increased by 8.46% compared to using the local facial expressions alone. This glasses-type wearable system, which measures multi-channel facial responses in a natural manner, is well suited to monitoring a user’s emotions in daily life and has considerable potential for use in the healthcare industry. (A minimal illustrative sketch of this kind of multi-channel fusion appears after this record.)
format | article
author | Jangho Kwon; Jihyeon Ha; Da-Hye Kim; Jun Won Choi; Laehyun Kim
author_sort | Jangho Kwon
title | Emotion Recognition Using a Glasses-Type Wearable Device via Multi-Channel Facial Responses
title_sort | emotion recognition using a glasses-type wearable device via multi-channel facial responses
publisher | IEEE
publishDate | 2021
url | https://doaj.org/article/f1d1495afbd147e6b3182ed1a3df8859
_version_ | 1718441439288885248
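
The description field above reports that fusing biosignals (EDA and PPG) with local facial images improved emotion classification accuracy over images alone. As a rough illustration of that idea only — the article's actual features, classifier, and data are not reproduced in this record, and every dimension, name, and value below is a hypothetical placeholder — here is a minimal sketch of feature-level ("early") fusion on synthetic data:

```python
# Illustrative sketch ONLY -- not the algorithm from the article.
# It shows early fusion: image features and EDA/PPG features are
# concatenated and fed to one classifier, so a fused model can be
# compared against an image-only baseline.
# All sizes are hypothetical placeholders; the data is random.

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

n_samples = 200   # labeled time segments (hypothetical)
n_img = 64        # features from eye-region images (hypothetical)
n_eda = 8         # e.g., skin-conductance response statistics
n_ppg = 8         # e.g., heart-rate / pulse-wave statistics
n_classes = 3     # e.g., positive / neutral / negative

X_img = rng.normal(size=(n_samples, n_img))
X_eda = rng.normal(size=(n_samples, n_eda))
X_ppg = rng.normal(size=(n_samples, n_ppg))
y = rng.integers(0, n_classes, size=n_samples)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))

# Image-only baseline.
acc_img = cross_val_score(clf, X_img, y, cv=5).mean()

# Early fusion: concatenate all channels into one feature vector.
X_fused = np.hstack([X_img, X_eda, X_ppg])
acc_fused = cross_val_score(clf, X_fused, y, cv=5).mean()

print(f"image-only CV accuracy: {acc_img:.3f}")
print(f"fused CV accuracy:      {acc_fused:.3f}")
```

On random synthetic data both scores will hover near chance; the sketch only shows the pipeline shape, namely that per-channel feature vectors are concatenated before a single classifier, which is the simplest way a biosignal channel can supplement an image channel.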