Abstract

Recent advances in brain-machine interfaces (BMIs) have demonstrated the possibility of motor neuroprosthetics directly controlled by brain activity. Ideally, neuroprosthetic limbs should be integrated into the subject's body schema. To explore ways to enhance such incorporation, we recorded modulations of neuronal ensemble activity in the primary somatosensory cortex (S1) during tactile stimulation simulated in virtual reality (VR) under conditions known to evoke the rubber-hand illusion. A realistic 3D mesh represented the monkey's body in VR. The monkey's arms were hidden by an opaque plate, and virtual arms were projected onto the plate. A robotic brush, also hidden from the monkey, touched various locations on the monkey's forearms in synchrony with a virtual brush touching the projected VR arms. Additionally, we implemented tactile stimulation with air puffs. We tested various combinations of tactile (physical touch), visual (the VR arm being touched), and auditory (the sound of the robotic brush touching the arm) inputs: synchronous tactile and visual (T-VR), tactile without visual (T), and visual only (VR). Neuronal ensemble activity was recorded from S1 and the primary motor cortex (M1). We found differences in both S1 and M1 activity across stimulation types. In particular, S1 responses to T-VR were stronger than to T. Moreover, S1 neurons were modulated during visual stimulation without touch (VR), suggesting S1 activation as a neuronal mechanism of the rubber-hand illusion. Further, we decoded stimulation parameters from the activity of large neuronal populations. These results suggest a flexible and distributed representation of somatosensory information in the cortex, which can be modified by visual feedback from the body and/or artificial actuators.
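The abstract reports that stimulation parameters were decoded from large neuronal populations but does not specify the decoder. Below is a minimal, hypothetical sketch of one standard approach: binned spike counts classified with cross-validated linear discriminant analysis. All variable names, data shapes, the simulated data, and the choice of LDA are illustrative assumptions, not the authors' actual method.

```python
# Hypothetical sketch: decoding stimulation condition (T, VR, T-VR) from
# binned spike counts of a recorded neuronal population. Data here are
# simulated; in the actual study, X would hold per-trial spike counts.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Simulated data: n_trials x n_neurons matrix of spike counts in a
# post-stimulus window, plus one condition label per trial
# (0 = T, 1 = VR, 2 = T-VR).
n_trials, n_neurons = 300, 120
X = rng.poisson(lam=5.0, size=(n_trials, n_neurons)).astype(float)
y = rng.integers(0, 3, size=n_trials)

# Standardize each neuron's counts, then classify with linear discriminant
# analysis; stratified cross-validation estimates decoding accuracy.
decoder = make_pipeline(StandardScaler(), LinearDiscriminantAnalysis())
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(decoder, X, y, cv=cv)
print(f"decoding accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```

With the random labels used in this sketch, accuracy sits near the 1/3 chance level; above-chance accuracy on real recordings would indicate that the population activity carries information about stimulation type.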
