000183070 001__ 183070
000183070 005__ 20190316235542.0
000183070 0247_ $$2doi$$a10.1371/journal.pone.0049473
000183070 022__ $$a1932-6203
000183070 02470 $$2ISI$$a000312588200005
000183070 037__ $$aARTICLE
000183070 245__ $$aExtending the body to virtual tools using a robotic surgical interface: evidence from the crossmodal congruency task
000183070 269__ $$a2012
000183070 260__ $$aSan Francisco$$bPublic Library of Science$$c2012
000183070 300__ $$a9
000183070 336__ $$aJournal Articles
000183070 520__ $$aThe effects of real-world tool use on body and space representations are relatively well established in cognitive neuroscience. Several studies have shown, for example, that active tool use results in a facilitated integration of multisensory information in peripersonal space, i.e. the space directly surrounding the body. However, it remains unknown to what extent similar mechanisms apply to the use of virtual-robotic tools, such as those used in the field of surgical robotics, in which a surgeon may use bimanual haptic interfaces to control a surgery robot at a remote location. This paper presents two experiments in which participants used a haptic handle, originally designed for a commercial surgery robot, to control a virtual tool. The integration of multisensory information related to the virtual-robotic tool was assessed by means of the crossmodal congruency task, in which subjects responded to tactile vibrations applied to their fingers while ignoring visual distractors superimposed on the tip of the virtual-robotic tool. Our results show that active virtual-robotic tool use changes the spatial modulation of the crossmodal congruency effects, comparable to changes in the representation of peripersonal space observed during real-world tool use. Moreover, when the virtual-robotic tools were held in a crossed position, the visual distractors interfered strongly with tactile stimuli delivered to the hand that was connected to them via the tool, reflecting a remapping of peripersonal space. Such remapping was observed not only when the virtual-robotic tools were actively used (Experiment 1), but also when the tools were held passively (Experiment 2). The present study extends earlier findings on the extension of peripersonal space from physical and pointing tools to virtual-robotic tools, using techniques from haptics and virtual reality. We discuss our data with respect to learning and human factors in the field of surgical robotics, and discuss the use of new technologies in the field of cognitive neuroscience.
000183070 700__ $$0242148$$aSengül, Ali$$g188789
000183070 700__ $$0244855$$avan Elk, Michiel$$g206826
000183070 700__ $$0242149$$aRognini, Giulio$$g188041
000183070 700__ $$aAspell, Jane Elizabeth
000183070 700__ $$aBleuler, Hannes
000183070 700__ $$0240593$$aBlanke, Olaf$$g165806
000183070 773__ $$j7$$k12$$qe49473$$tPLoS ONE
000183070 8564_ $$s250472$$uhttps://infoscience.epfl.ch/record/183070/files/Seng%C3%BCl%20et%20al._2012.pdf$$yPublisher's version$$zPublisher's version
000183070 909C0 $$0252325$$pLNCO$$xU11025
000183070 909CO $$ooai:infoscience.tind.io:183070$$pSV$$particle$$qGLOBAL_SET
000183070 917Z8 $$x198754
000183070 917Z8 $$x182396
000183070 937__ $$aEPFL-ARTICLE-183070
000183070 973__ $$aEPFL$$rREVIEWED$$sPUBLISHED
000183070 980__ $$aARTICLE