How bodies and voices interact in early emotion perception.

Successful social communication draws strongly on the correct interpretation of others' body and vocal expressions. Both can provide emotional information and often occur simultaneously. Yet their interplay has hardly been studied. Using electroencephalography, we investigated the temporal development underlying their neural interaction in auditory and visual perception. In particular, we tested whether this interaction qualifies as true integration following multisensory integration principles such as inverse effectiveness. Emotional vocalizations were embedded in either low or high levels of noise and presented with or without video clips of matching emotional body expressions. In both high and low noise conditions, a reduction in auditory N100 amplitude was observed for audiovisual stimuli. However, only under high noise did the N100 peak earlier in the audiovisual than in the auditory condition, suggesting facilitatory effects as predicted by the inverse effectiveness principle. Similarly, we observed earlier N100 peaks in response to emotional compared to neutral audiovisual stimuli; this was not the case in the unimodal auditory condition. Furthermore, suppression of beta-band oscillations (15-25 Hz), primarily reflecting biological motion perception, was modulated 200-400 ms after the vocalization. While emotional stimuli showed larger differences in suppression between audiovisual and audio-only stimuli under high compared to low noise, no such difference was observed for neutral stimuli. This observation is in accordance with the inverse effectiveness principle and suggests a modulation of integration by emotional content. Overall, the results show that ecologically valid, complex stimuli such as combined body and vocal expressions are effectively integrated very early in processing.
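
For illustration only (this is not the authors' analysis pipeline): the sketch below shows how the two dependent measures referred to in the abstract, the N100 peak (amplitude and latency) and a beta-band (15-25 Hz) power change in a 200-400 ms window, could be quantified from a single-channel averaged EEG trace with NumPy/SciPy. The synthetic signal, sampling rate, and the exact analysis windows (e.g., 80-150 ms for the N100) are assumptions, not values taken from the study.

```python
# Illustrative sketch (not the authors' analysis code): quantify an N100 peak
# and a beta-band (15-25 Hz) power change from a single-channel averaged EEG trace.
# The synthetic signal, sampling rate, and analysis windows are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

sfreq = 500.0                                 # sampling rate in Hz (assumed)
t = np.arange(-0.2, 0.6, 1.0 / sfreq)         # epoch from -200 to 600 ms

# Synthetic "ERP": a negative deflection around 100 ms plus background noise.
rng = np.random.default_rng(0)
erp = -4e-6 * np.exp(-((t - 0.10) ** 2) / (2 * 0.02 ** 2))
erp += 0.5e-6 * rng.standard_normal(t.size)

# --- N100: most negative peak between 80 and 150 ms post-stimulus (assumed window) ---
n100_win = (t >= 0.08) & (t <= 0.15)
idx = np.argmin(erp[n100_win])
n100_amp = erp[n100_win][idx]                 # amplitude in volts
n100_lat = t[n100_win][idx]                   # latency in seconds
print(f"N100: {n100_amp * 1e6:.2f} uV at {n100_lat * 1e3:.0f} ms")

# --- Beta band (15-25 Hz): power change in the 200-400 ms window vs. baseline ---
b, a = butter(4, [15, 25], btype="bandpass", fs=sfreq)
beta = filtfilt(b, a, erp)
power = np.abs(hilbert(beta)) ** 2            # instantaneous beta power
baseline = power[(t >= -0.2) & (t < 0.0)].mean()
window = power[(t >= 0.2) & (t <= 0.4)].mean()
rel_change = (window - baseline) / baseline   # negative values indicate suppression
print(f"Beta power change 200-400 ms: {rel_change * 100:.1f} %")
```

In practice, such measures would be computed per condition (audiovisual vs. auditory-only, high vs. low noise, emotional vs. neutral) and compared statistically; the inverse effectiveness principle predicts the largest audiovisual benefit where the unimodal signal is weakest, i.e., under high noise.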

Bibliographic Details
Main Authors: Sarah Jessen, Jonas Obleser, Sonja A Kotz
Format: article
Language: EN
Published: Public Library of Science (PLoS), 2012
Subjects: Medicine (R); Science (Q)
Online Access: https://doaj.org/article/cd986d4d47fd408fb55a2837bcebf9f8
Journal: PLoS ONE, Vol 7, Iss 4, p e36070 (2012)
ISSN: 1932-6203
DOI: 10.1371/journal.pone.0036070
Full text: https://www.ncbi.nlm.nih.gov/pmc/articles/pmid/22558332/pdf/?tool=EBI
Source: DOAJ