Extending the body to virtual tools using a robotic surgical interface: evidence from the crossmodal congruency task.


Bibliographic Details
Main Authors: Ali Sengül, Michiel van Elk, Giulio Rognini, Jane Elizabeth Aspell, Hannes Bleuler, Olaf Blanke
Format: article
Language: EN
Published: Public Library of Science (PLoS), 2012
Subjects: Medicine (R); Science (Q)
Online Access: https://doaj.org/article/a9e4e3e87b724b4182ca0c411e5bf923
id oai:doaj.org-article:a9e4e3e87b724b4182ca0c411e5bf923
record_format dspace
spelling oai:doaj.org-article:a9e4e3e87b724b4182ca0c411e5bf923 2021-11-18T08:06:23Z
title Extending the body to virtual tools using a robotic surgical interface: evidence from the crossmodal congruency task.
issn 1932-6203
doi 10.1371/journal.pone.0049473
url https://doaj.org/article/a9e4e3e87b724b4182ca0c411e5bf923
date 2012-01-01T00:00:00Z
fulltext https://www.ncbi.nlm.nih.gov/pmc/articles/pmid/23227142/pdf/?tool=EBI
journal_toc https://doaj.org/toc/1932-6203
authors Ali Sengül; Michiel van Elk; Giulio Rognini; Jane Elizabeth Aspell; Hannes Bleuler; Olaf Blanke
publisher Public Library of Science (PLoS)
format article
topics Medicine (R); Science (Q)
language EN
source PLoS ONE, Vol 7, Iss 12, p e49473 (2012)
institution DOAJ
collection DOAJ
language EN
topic Medicine
R
Science
Q
spellingShingle Medicine
R
Science
Q
Ali Sengül
Michiel van Elk
Giulio Rognini
Jane Elizabeth Aspell
Hannes Bleuler
Olaf Blanke
Extending the body to virtual tools using a robotic surgical interface: evidence from the crossmodal congruency task.
description The effects of real-world tool use on body or space representations are relatively well established in cognitive neuroscience. Several studies have shown, for example, that active tool use results in a facilitated integration of multisensory information in peripersonal space, i.e. the space directly surrounding the body. However, it remains unknown to what extent similar mechanisms apply to the use of virtual-robotic tools, such as those used in the field of surgical robotics, in which a surgeon may use bimanual haptic interfaces to control a surgery robot at a remote location. This paper presents two experiments in which participants used a haptic handle, originally designed for a commercial surgery robot, to control a virtual tool. The integration of multisensory information related to the virtual-robotic tool was assessed by means of the crossmodal congruency task, in which subjects responded to tactile vibrations applied to their fingers while ignoring visual distractors superimposed on the tip of the virtual-robotic tool. Our results show that active virtual-robotic tool use changes the spatial modulation of the crossmodal congruency effects, comparable to changes in the representation of peripersonal space observed during real-world tool use. Moreover, when the virtual-robotic tools were held in a crossed position, the visual distractors interfered strongly with tactile stimuli delivered to the hand that was connected to them via the tool, reflecting a remapping of peripersonal space. Such remapping was observed not only when the virtual-robotic tools were actively used (Experiment 1), but also when they were merely held passively (Experiment 2). The present study extends earlier findings on the extension of peripersonal space from physical and pointing tools to virtual-robotic tools using techniques from haptics and virtual reality. We discuss our data with respect to learning and human factors in the field of surgical robotics and discuss the use of new technologies in the field of cognitive neuroscience.
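The crossmodal congruency effect (CCE) referred to in the description is conventionally quantified as the difference in reaction time to tactile targets between trials with spatially incongruent versus congruent visual distractors; a larger CCE indicates stronger visuo-tactile interaction at the distractor location. The Python sketch below illustrates that computation on hypothetical reaction-time data; it is not code, data, or analysis from the study, and the condition names and values are illustrative assumptions only.

```python
# Minimal sketch of how a crossmodal congruency effect (CCE) is typically computed:
# CCE = mean RT on incongruent trials - mean RT on congruent trials.
# All trial values below are hypothetical, for illustration only.

from statistics import mean

trials = [
    # (tool_posture, congruency, reaction_time_ms)
    ("uncrossed", "congruent", 520), ("uncrossed", "incongruent", 585),
    ("uncrossed", "congruent", 540), ("uncrossed", "incongruent", 600),
    ("crossed",   "congruent", 530), ("crossed",   "incongruent", 640),
    ("crossed",   "congruent", 545), ("crossed",   "incongruent", 655),
]

def cce(trials, posture):
    """Crossmodal congruency effect for one tool posture, in milliseconds."""
    rt = lambda c: mean(t[2] for t in trials if t[0] == posture and t[1] == c)
    return rt("incongruent") - rt("congruent")

for posture in ("uncrossed", "crossed"):
    print(f"{posture}: CCE = {cce(trials, posture):.0f} ms")
```

Comparing the CCE across tool postures (e.g. crossed versus uncrossed) is the kind of spatial modulation the abstract describes as evidence for a remapping of peripersonal space.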
format article
author Ali Sengül
Michiel van Elk
Giulio Rognini
Jane Elizabeth Aspell
Hannes Bleuler
Olaf Blanke
author_facet Ali Sengül
Michiel van Elk
Giulio Rognini
Jane Elizabeth Aspell
Hannes Bleuler
Olaf Blanke
author_sort Ali Sengül
title Extending the body to virtual tools using a robotic surgical interface: evidence from the crossmodal congruency task.
title_short Extending the body to virtual tools using a robotic surgical interface: evidence from the crossmodal congruency task.
title_full Extending the body to virtual tools using a robotic surgical interface: evidence from the crossmodal congruency task.
title_fullStr Extending the body to virtual tools using a robotic surgical interface: evidence from the crossmodal congruency task.
title_full_unstemmed Extending the body to virtual tools using a robotic surgical interface: evidence from the crossmodal congruency task.
title_sort extending the body to virtual tools using a robotic surgical interface: evidence from the crossmodal congruency task.
publisher Public Library of Science (PLoS)
publishDate 2012
url https://doaj.org/article/a9e4e3e87b724b4182ca0c411e5bf923
work_keys_str_mv AT alisengul extendingthebodytovirtualtoolsusingaroboticsurgicalinterfaceevidencefromthecrossmodalcongruencytask
AT michielvanelk extendingthebodytovirtualtoolsusingaroboticsurgicalinterfaceevidencefromthecrossmodalcongruencytask
AT giuliorognini extendingthebodytovirtualtoolsusingaroboticsurgicalinterfaceevidencefromthecrossmodalcongruencytask
AT janeelizabethaspell extendingthebodytovirtualtoolsusingaroboticsurgicalinterfaceevidencefromthecrossmodalcongruencytask
AT hannesbleuler extendingthebodytovirtualtoolsusingaroboticsurgicalinterfaceevidencefromthecrossmodalcongruencytask
AT olafblanke extendingthebodytovirtualtoolsusingaroboticsurgicalinterfaceevidencefromthecrossmodalcongruencytask
_version_ 1718422250395271168