Multisensory interactions facilitate categorical discrimination of objects
Object representations within both the auditory and visual systems include partially segregated networks for living and man-made items. It remains unresolved whether and how multisensory interactions affect access to and discrimination of these representations. In the present study, participants performed a living/non-living discrimination task on auditory, visual, or auditory-visual stimuli while 160-channel event-related potentials (ERPs) were recorded. Reaction times were slowest for auditory conditions but did not differ between multisensory and visual conditions, providing no evidence for multisensory performance enhancement. ERP analyses focused on identifying topographic modulations, because such modulations necessarily reflect distinct configurations of intracranial brain networks. First, these analyses revealed that topographic ERP differences between object categories occurred ∼40 ms earlier following multisensory (∼200 ms) than either visual (∼240 ms) or auditory (∼340 ms) stimulation; multisensory interactions therefore facilitate segregated cortical object representations. Second, the earliest non-linear multisensory neural response interactions manifested as topographic ERP modulations, irrespective of object category, at 60 ms post-stimulus onset. Auditory-visual multisensory interactions thus recruit (partially) distinct brain generators from those active under unisensory conditions, corroborating and extending our earlier findings with simple and task-irrelevant stimuli. Together, these results begin to differentiate multiple temporal and functional stages of auditory-visual multisensory interactions.
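The non-linear interactions mentioned above are conventionally identified with the additive model: the multisensory response (AV) is compared against the summed unisensory responses (A + V), and any reliable residual indicates a non-linear interaction. The sketch below illustrates this comparison on synthetic ERP arrays; the array shapes, variable names, and use of global field power are illustrative assumptions, not taken from the study's analysis pipeline.

```python
import numpy as np

# Illustrative sketch of the additive-model test for non-linear
# multisensory interactions. All data here are synthetic; shapes
# mimic a 160-channel ERP sampled at 500 time points.
rng = np.random.default_rng(0)
n_channels, n_timepoints = 160, 500

erp_a = rng.standard_normal((n_channels, n_timepoints))   # auditory ERP
erp_v = rng.standard_normal((n_channels, n_timepoints))   # visual ERP
# Simulated multisensory ERP: additive part plus a small residual.
erp_av = erp_a + erp_v + 0.5 * rng.standard_normal((n_channels, n_timepoints))

# Interaction term: AV - (A + V). Under a purely additive (linear)
# model this difference is zero apart from noise.
interaction = erp_av - (erp_a + erp_v)

# Global field power (GFP): the spatial standard deviation across
# channels at each time point, a reference-free index of response
# strength often used to time-localize such effects.
gfp = interaction.std(axis=0)
print(interaction.shape, gfp.shape)
```

Topographic (rather than strength-based) analyses would additionally normalize each time point's map before comparing conditions, since the study's inference concerns map configuration, not amplitude.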