Multisensory visuo-tactile context learning enhances the guidance of unisensory visual search
Abstract: Does multisensory distractor-target context learning enhance visual search over and above unisensory learning? To address this, we had participants perform a visual search task under both uni- and multisensory conditions. Search arrays consisted of one Gabor target that differed from three...
Main Authors: Siyi Chen, Zhuanghua Shi, Hermann J. Müller, Thomas Geyer
Format: Article
Language: English
Published: Nature Portfolio, 2021
Online Access: https://doaj.org/article/4bb5e9f8834c4b8fa04fd4602b3a5b7e
Similar Items
- Behavioral impact of unisensory and multisensory audio-tactile events: pros and cons for interlimb coordination in juggling.
  by: Gregory Zelic, et al.
  Published: (2012)
- Interaction of perceptual grouping and crossmodal temporal capture in tactile apparent-motion.
  by: Lihan Chen, et al.
  Published: (2011)
- Insights on embodiment induced by visuo-tactile stimulation during robotic telepresence
  by: D. Farizon, et al.
  Published: (2021)
- Virtual reality alters cortical oscillations related to visuo-tactile integration during rubber hand illusion
  by: Noriaki Kanayama, et al.
  Published: (2021)
- Peripersonal space in the front, rear, left and right directions for audio-tactile multisensory integration
  by: Yusuke Matsuda, et al.
  Published: (2021)