Abstract

Although there is increasing knowledge about how visual and tactile cues from the hands are integrated, little is known about how self-generated hand movements affect such multisensory integration. Visuo-tactile integration often occurs under highly dynamic conditions requiring sensorimotor updating. Here, we quantified visuo-tactile integration by measuring cross-modal congruency effects (CCEs) in different bimanual hand movement conditions using a robotic platform. We found that classical CCEs also occurred during bimanual self-generated hand movements, and that such movements lowered the magnitude of visuo-tactile CCEs compared to static conditions. Visuo-tactile integration, body ownership and the sense of agency were decreased by adding a temporal visuo-motor delay between hand movements and visual feedback. These data show that visual stimuli interfere less with the perception of tactile stimuli during movement than during static conditions, especially when decoupled from predictive motor information. The results suggest that current models of visuo-tactile integration need to be extended to account for multisensory integration in dynamic conditions.
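As background to the measure named above: a cross-modal congruency effect is commonly quantified as the difference in performance (e.g., reaction time) between incongruent and congruent visuo-tactile trials. The following minimal Python sketch illustrates that computation under assumed trial fields (congruent, rt_ms, correct); it is not the paper's actual analysis pipeline, only an illustration of the measure.

    # Illustrative sketch: CCE = mean RT on incongruent trials minus congruent trials.
    # Field names and trial values are hypothetical, not the authors' data or code.
    from statistics import mean

    def cce(trials):
        """Return the CCE in ms, using only correct trials."""
        congruent = [t["rt_ms"] for t in trials if t["correct"] and t["congruent"]]
        incongruent = [t["rt_ms"] for t in trials if t["correct"] and not t["congruent"]]
        return mean(incongruent) - mean(congruent)

    # CCEs can then be computed separately per movement condition
    # (e.g., static vs. self-generated movement) and compared.
    static_trials = [
        {"congruent": True,  "rt_ms": 520.0, "correct": True},
        {"congruent": False, "rt_ms": 610.0, "correct": True},
        {"congruent": True,  "rt_ms": 540.0, "correct": True},
        {"congruent": False, "rt_ms": 625.0, "correct": True},
    ]
    print(f"Static CCE: {cce(static_trials):.1f} ms")  # larger CCE = stronger visual interference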
