Evaluation of an eye tracking setup for studying visual attention in face-to-face conversations

Abstract: Many eye tracking studies use facial stimuli presented on a display to investigate attentional processing of social stimuli. To introduce a more realistic approach that allows interaction between two real people, we evaluated a new eye tracking setup in three independent studies in terms of data quality, short-term reliability and feasibility. Study 1 measured the robustness, precision and accuracy for calibration stimuli compared to a classical display-based setup. Study 2 used the identical measures with an independent study sample to compare the data quality for a photograph of a face (2D) and the face of the real person (3D). Study 3 evaluated data quality over the course of a real face-to-face conversation and examined the gaze behavior on the facial features of the conversation partner. Study 1 provides evidence that quality indices for the scene-based setup were comparable to those of a classical display-based setup. Average accuracy was better than 0.4° visual angle. Study 2 demonstrates that eye tracking quality is sufficient for 3D stimuli and robust against short interruptions without re-calibration. Study 3 confirms the long-term stability of tracking accuracy during a face-to-face interaction and demonstrates typical gaze patterns for facial features. Thus, the eye tracking setup presented here seems feasible for studying gaze behavior in dyadic face-to-face interactions. Eye tracking data obtained with this setup achieves an accuracy that is sufficient for investigating behavior such as eye contact in social interactions in a range of populations including clinical conditions, such as autism spectrum and social phobia.

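The abstract reports data quality as accuracy and precision in degrees of visual angle (average accuracy better than 0.4°). As a rough illustration only, and not the authors' actual analysis pipeline, the sketch below shows how such metrics are commonly computed from raw gaze samples; the function names, the synthetic sample data, and the assumed viewing distance of 650 mm are hypothetical.

```python
import numpy as np

def visual_angle_deg(offset_mm, distance_mm):
    # Convert an offset in the stimulus plane (mm) at viewing distance D (mm)
    # into degrees of visual angle: theta = 2 * arctan(d / (2 * D)).
    return np.degrees(2.0 * np.arctan2(offset_mm, 2.0 * distance_mm))

def accuracy_deg(gaze_xy, target_xy, distance_mm):
    # Accuracy: mean angular offset between recorded gaze points and the
    # known target location (lower is better).
    offsets = np.linalg.norm(np.asarray(gaze_xy) - np.asarray(target_xy), axis=1)
    return float(np.mean(visual_angle_deg(offsets, distance_mm)))

def precision_rms_deg(gaze_xy, distance_mm):
    # Precision: root mean square of successive sample-to-sample angular
    # distances (spatial noise of the gaze signal).
    steps = np.linalg.norm(np.diff(np.asarray(gaze_xy), axis=0), axis=1)
    return float(np.sqrt(np.mean(visual_angle_deg(steps, distance_mm) ** 2)))

# Hypothetical example: 300 gaze samples (mm in the stimulus plane) recorded
# while a participant fixates a calibration target at ~650 mm viewing distance.
rng = np.random.default_rng(0)
target = np.array([0.0, 0.0])
gaze = target + rng.normal(loc=2.0, scale=1.5, size=(300, 2))  # synthetic data

print(f"accuracy:  {accuracy_deg(gaze, target, 650):.2f} deg")
print(f"precision: {precision_rms_deg(gaze, 650):.2f} deg (RMS-S2S)")
```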

Saved in:
Bibliographic Details
Main Authors: Antonia Vehlen, Ines Spenthof, Daniel Tönsing, Markus Heinrichs, Gregor Domes
Format: Article
Language: EN
Published: Nature Portfolio, 2021
Subjects: Medicine (R), Science (Q)
Online Access: https://doaj.org/article/21aee6ab7d694ac99daf21826d70d0ce
id oai:doaj.org-article:21aee6ab7d694ac99daf21826d70d0ce
record_format dspace
doi 10.1038/s41598-021-81987-x
issn 2045-2322
last_indexed 2021-12-02T13:24:07Z
fulltext https://doi.org/10.1038/s41598-021-81987-x
journal_toc https://doaj.org/toc/2045-2322
source Scientific Reports, Vol 11, Iss 1, Pp 1-16 (2021)
institution DOAJ
collection DOAJ
language EN
topic Medicine
R
Science
Q
description Abstract Many eye tracking studies use facial stimuli presented on a display to investigate attentional processing of social stimuli. To introduce a more realistic approach that allows interaction between two real people, we evaluated a new eye tracking setup in three independent studies in terms of data quality, short-term reliability and feasibility. Study 1 measured the robustness, precision and accuracy for calibration stimuli compared to a classical display-based setup. Study 2 used the identical measures with an independent study sample to compare the data quality for a photograph of a face (2D) and the face of the real person (3D). Study 3 evaluated data quality over the course of a real face-to-face conversation and examined the gaze behavior on the facial features of the conversation partner. Study 1 provides evidence that quality indices for the scene-based setup were comparable to those of a classical display-based setup. Average accuracy was better than 0.4° visual angle. Study 2 demonstrates that eye tracking quality is sufficient for 3D stimuli and robust against short interruptions without re-calibration. Study 3 confirms the long-term stability of tracking accuracy during a face-to-face interaction and demonstrates typical gaze patterns for facial features. Thus, the eye tracking setup presented here seems feasible for studying gaze behavior in dyadic face-to-face interactions. Eye tracking data obtained with this setup achieves an accuracy that is sufficient for investigating behavior such as eye contact in social interactions in a range of populations including clinical conditions, such as autism spectrum and social phobia.
format article
author Antonia Vehlen
Ines Spenthof
Daniel Tönsing
Markus Heinrichs
Gregor Domes
title Evaluation of an eye tracking setup for studying visual attention in face-to-face conversations
publisher Nature Portfolio
publishDate 2021
url https://doaj.org/article/21aee6ab7d694ac99daf21826d70d0ce