Combining proprioception and touch to compute spatial information
Localising a tactile stimulus in egocentric space involves integrating information from skin receptors with proprioceptive inputs about body posture. We investigated whether body posture automatically influences tactile spatial judgements, even when it is irrelevant to the task. In Experiment 1, participants received two successive tactile stimuli on the forearm and were asked to indicate whether the first or second touch of the pair was closer to an anatomical body landmark, either the wrist or the elbow. The task was administered in three experimental conditions involving different body postures: a canonical body posture with the forearm extended and the hand pointing distally; a non-canonical body posture with the forearm and hand pointing vertically up at 90°; and a 'reversed' body posture with the elbow fully flexed at 180°, so that the hand pointed proximally. Thus, our task required localising touch on the skin and then relating skin locations to anatomical body landmarks. Critically, both functions are independent of the posture of the body in space. We nevertheless found reliable effects of body posture: judgement errors increased when the canonical forearm posture was rotated through 180°. These results were further confirmed in Experiment 2, in which stimuli were delivered to the finger. However, additionally reversing the canonical posture of the finger, as well as that of the forearm, so that the finger was restored to its canonical orientation in egocentric space, returned performance to normal levels. Our results confirm an automatic process of localising the body in external space underlying tactile perception. This process appears to involve a combination of proprioceptive and tactile information.