Vestibular and active self-motion signals drive visual perception in binocular rivalry

Summary: Multisensory integration helps the brain build reliable models of the world and resolve ambiguities. Visual interactions with sound and touch are well established, but vestibular influences on vision are less well studied. Here, we test the vestibular influence on vision using horizontally opposed motions presented one to each eye so that visual perception is unstable and alternates irregularly. …

Full description

Saved in:
Bibliographic Details
Main Authors: David Alais, Robert Keys, Frans A.J. Verstraten, Chris L.E. Paffen
Format: article
Language: EN
Published: Elsevier, 2021
Subjects:
Q
Online Access: https://doaj.org/article/8cd2e76142c24cf1aabb998d428293bf
id oai:doaj.org-article:8cd2e76142c24cf1aabb998d428293bf
record_format dspace
spelling oai:doaj.org-article:8cd2e76142c24cf1aabb998d428293bf
record_updated 2021-11-26T04:37:41Z
issn 2589-0042
doi 10.1016/j.isci.2021.103417
published 2021-12-01
fulltext_url http://www.sciencedirect.com/science/article/pii/S2589004221013882
journal_toc https://doaj.org/toc/2589-0042
source iScience, Vol 24, Iss 12, Pp 103417- (2021)
institution DOAJ
collection DOAJ
language EN
topic Biological sciences
Neuroscience
Sensory neuroscience
Science
Q
description Summary: Multisensory integration helps the brain build reliable models of the world and resolve ambiguities. Visual interactions with sound and touch are well established, but vestibular influences on vision are less well studied. Here, we test the vestibular influence on vision using horizontally opposed motions presented one to each eye so that visual perception is unstable and alternates irregularly. Passive, whole-body rotations in the yaw plane stabilized visual alternations, with perceived direction oscillating congruently with rotation (leftward motion during leftward rotation, and vice versa). This demonstrates that a purely vestibular signal can resolve ambiguous visual motion and determine visual perception. Active self-rotation following the same sinusoidal profile also entrained vision to the rotation cycle, more strongly and with a shorter time lag, likely because of efference copy and predictive internal models. Both experiments show that visual ambiguity provides an effective paradigm to reveal how vestibular and motor inputs can shape visual perception.
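The entrainment described in the abstract (perceived rivalry direction following a sinusoidal yaw rotation, with a measurable time lag) can be illustrated with a minimal analysis sketch. The Python/NumPy code below is not the authors' analysis pipeline; the sampling rate, rotation frequency, and simulated button-press reports are all assumptions, and it simply computes a congruence index and a cross-correlation lag between perceived direction and rotation direction.

import numpy as np

# Illustrative sketch only (not the published analysis): quantify how strongly
# binary rivalry reports (leftward vs. rightward motion) follow a sinusoidal
# yaw-rotation profile. All parameter values below are assumptions.

fs = 60.0        # sampling rate of the report stream (Hz), assumed
duration = 60.0  # trial length in seconds, assumed
f_rot = 0.25     # rotation frequency (Hz), assumed
t = np.arange(0, duration, 1 / fs)

# For a sinusoidal chair position sin(2*pi*f*t), the yaw velocity is
# proportional to cos(2*pi*f*t); its sign gives the current rotation
# direction (+1 vs. -1).
velocity_sign = np.sign(np.cos(2 * np.pi * f_rot * t))

# Fake perceptual reports standing in for button-press data: mostly congruent
# with the rotation direction, plus random flips.
rng = np.random.default_rng(0)
reports = np.where(rng.random(t.size) < 0.8, velocity_sign,
                   rng.choice([-1.0, 1.0], size=t.size))

# Simple congruence index: fraction of time the perceived direction matches
# the rotation direction (0.5 = chance, 1.0 = perfect entrainment).
congruence = np.mean(reports == velocity_sign)

# Cross-correlation to estimate the lag at which perception best aligns with
# the rotation cycle.
lags = np.arange(-t.size + 1, t.size)
xcorr = np.correlate(reports - reports.mean(),
                     velocity_sign - velocity_sign.mean(), mode="full")
best_lag_s = lags[np.argmax(xcorr)] / fs

print(f"congruence = {congruence:.2f}, best lag = {best_lag_s:+.2f} s")

In this toy setup the congruence comes out well above chance and the lag near zero by construction; with real report data, a positive lag of a few hundred milliseconds would indicate perception trailing the rotation, as the abstract describes for the passive condition.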
format article
author David Alais
Robert Keys
Frans A.J. Verstraten
Chris L.E. Paffen
author_sort David Alais
title Vestibular and active self-motion signals drive visual perception in binocular rivalry
publisher Elsevier
publishDate 2021
url https://doaj.org/article/8cd2e76142c24cf1aabb998d428293bf
_version_ 1718409853470244864