Emotional cues during simultaneous face and voice processing: electrophysiological insights.
Both facial expression and tone of voice represent key signals of emotional communication, but their brain processing correlates remain unclear. Accordingly, we constructed a novel implicit emotion recognition task consisting of simultaneously presented human faces and voices with neutral, happy, and angry valence, embedded within a monkey face and voice recognition task.
Saved in: | DOAJ |
Main Authors: | Taosheng Liu, Ana Pinheiro, Zhongxin Zhao, Paul G Nestor, Robert W McCarley, Margaret A Niznikiewicz |
Format: | article |
Language: | EN |
Published: | Public Library of Science (PLoS), 2012 |
Subjects: | Medicine; Science |
Online Access: | https://doaj.org/article/d75781aa3e0d44d1a5a5b36f941e7b80 |
id |
oai:doaj.org-article:d75781aa3e0d44d1a5a5b36f941e7b80 |
record_format |
dspace |
issn |
1932-6203 |
doi |
10.1371/journal.pone.0031001 |
fulltext_url |
https://www.ncbi.nlm.nih.gov/pmc/articles/pmid/22383987/pdf/?tool=EBI |
source |
PLoS ONE, Vol 7, Iss 2, p e31001 (2012) |
institution |
DOAJ |
collection |
DOAJ |
language |
EN |
topic |
Medicine (R); Science (Q) |
description |
Both facial expression and tone of voice represent key signals of emotional communication, but their brain processing correlates remain unclear. Accordingly, we constructed a novel implicit emotion recognition task consisting of simultaneously presented human faces and voices with neutral, happy, and angry valence, embedded within a monkey face and voice recognition task. To investigate the temporal unfolding of the processing of affective information from human face-voice pairings, we recorded event-related potentials (ERPs) to these audiovisual test stimuli in 18 healthy subjects. N100, P200, N250, and P300 components were observed at electrodes in the frontal-central region, while P100, N170, and P270 components were observed at electrodes in the parietal-occipital region. Results indicated a significant audiovisual stimulus effect on the amplitudes and latencies of components in the frontal-central region (P200, P300, and N250) but not the parietal-occipital region (P100, N170, and P270). Specifically, P200 and P300 amplitudes were more positive for emotional relative to neutral audiovisual stimuli, irrespective of valence, whereas N250 amplitude was more negative for neutral relative to emotional stimuli. No differentiation was observed between the angry and happy conditions. The results suggest that a general effect of emotion on audiovisual processing can emerge as early as 200 ms (P200 peak latency) post stimulus onset, despite the implicit affective processing task demands, and that this effect is mainly distributed over the frontal-central region. |
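The abstract above centers on measuring ERP component amplitudes and latencies (e.g., the P200 at frontal-central electrodes) and comparing emotional versus neutral audiovisual conditions. The sketch below is a minimal, hypothetical illustration of that kind of measurement on simulated data; it is not the authors' analysis pipeline, and the channel names, time window, and synthetic signal are assumptions made for illustration only.

```python
# Minimal sketch (not the authors' pipeline): average epoched EEG into ERPs
# per condition and measure a "P200" peak amplitude/latency in a 150-250 ms
# window at assumed frontal-central channels. All data here are simulated.
import numpy as np

rng = np.random.default_rng(0)
sfreq = 500                                   # sampling rate in Hz (assumed)
times = np.arange(-0.1, 0.6, 1 / sfreq)       # epoch from -100 to 600 ms
channels = ["Fz", "FCz", "Cz"]                # hypothetical frontal-central sites
n_trials = 40

def simulate_epochs(p200_gain):
    """Simulate trials with a Gaussian positivity near 200 ms plus noise."""
    p200 = p200_gain * np.exp(-((times - 0.2) ** 2) / (2 * 0.02 ** 2))
    noise = rng.normal(0, 1.0, size=(n_trials, len(channels), len(times)))
    return p200[None, None, :] + noise        # shape: trials x channels x samples

epochs = {
    "neutral":   simulate_epochs(p200_gain=2.0),
    "emotional": simulate_epochs(p200_gain=3.5),  # larger simulated P200
}

window = (times >= 0.150) & (times <= 0.250)  # P200 search window (assumed)
for condition, data in epochs.items():
    erp = data.mean(axis=0)                   # average trials -> ERP (channels x samples)
    roi = erp.mean(axis=0)                    # average over frontal-central channels
    peak_idx = np.argmax(roi[window])         # most positive point in the window
    peak_amp = roi[window][peak_idx]
    peak_lat_ms = times[window][peak_idx] * 1000
    print(f"{condition:>9}: P200 amplitude {peak_amp:.2f} (a.u.), "
          f"latency {peak_lat_ms:.0f} ms")
```

With the simulated gains chosen above, the "emotional" condition yields a larger P200 than the "neutral" one, qualitatively mirroring the direction of the effect described in the abstract; the numbers themselves are arbitrary.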
format |
article |
author |
Taosheng Liu; Ana Pinheiro; Zhongxin Zhao; Paul G Nestor; Robert W McCarley; Margaret A Niznikiewicz |
author_sort |
Taosheng Liu |
title |
Emotional cues during simultaneous face and voice processing: electrophysiological insights. |
publisher |
Public Library of Science (PLoS) |
publishDate |
2012 |
url |
https://doaj.org/article/d75781aa3e0d44d1a5a5b36f941e7b80 |