Looking at the ventriloquist: visual outcome of eye movements calibrates sound localization.

A general problem in learning is how the brain determines what lesson to learn (and what lessons not to learn). For example, sound localization is a behavior that is partially learned with the aid of vision. This process requires correctly matching a visual location to that of a sound. This is an intrinsically circular problem when sound location is itself uncertain and the visual scene is rife with possible visual matches. Here, we develop a simple paradigm using visual guidance of sound localization to gain insight into how the brain confronts this type of circularity. We tested two competing hypotheses. 1: The brain guides sound location learning based on the synchrony or simultaneity of auditory-visual stimuli, potentially involving a Hebbian associative mechanism. 2: The brain uses a 'guess and check' heuristic in which visual feedback that is obtained after an eye movement to a sound alters future performance, perhaps by recruiting the brain's reward-related circuitry. We assessed the effects of exposure to visual stimuli spatially mismatched from sounds on performance of an interleaved auditory-only saccade task. We found that when humans and monkeys were provided the visual stimulus asynchronously with the sound but as feedback to an auditory-guided saccade, they shifted their subsequent auditory-only performance toward the direction of the visual cue by 1.3-1.7 degrees, or 22-28% of the original 6 degree visual-auditory mismatch. In contrast, when the visual stimulus was presented synchronously with the sound but extinguished too quickly to provide this feedback, there was little change in subsequent auditory-only performance. Our results suggest that the outcome of our own actions is vital to localizing sounds correctly. Contrary to previous expectations, visual calibration of auditory space does not appear to require visual-auditory associations based on synchrony/simultaneity.


Bibliographic Details
Main Authors: Daniel S Pages, Jennifer M Groh
Format: article
Language: EN
Published: Public Library of Science (PLoS), 2013
Subjects: Medicine (R), Science (Q)
Online Access: https://doaj.org/article/8695e19a37ec4012b8f7dffa222ea410
ISSN: 1932-6203
DOI: 10.1371/journal.pone.0072562
Full Text: https://www.ncbi.nlm.nih.gov/pmc/articles/pmid/24009691/?tool=EBI
Published in: PLoS ONE, Vol 8, Iss 8, p e72562 (2013)