Behavioural, modeling, and electrophysiological evidence for supramodality in human metacognition

Human metacognition, the capacity to introspect on one's own mental states, has been characterized mostly through confidence reports in visual tasks. A pressing question is to what extent results from visual studies generalize to other domains. Answering this question would determine whether metacognition operates through shared, supramodal mechanisms or through idiosyncratic, modality-specific mechanisms. Here, we report three new lines of evidence for decisional and post-decisional mechanisms arguing for the supramodality of metacognition. First, metacognitive efficiency correlated across auditory, tactile, visual, and audiovisual tasks. Second, confidence in an audiovisual task was best modeled using supramodal formats based on integrated representations of auditory and visual signals. Third, confidence in correct responses involved similar electrophysiological markers for visual and audiovisual tasks, associated with motor preparation preceding the perceptual judgment. We conclude that the supramodality of metacognition relies on supramodal confidence estimates and decisional signals that are shared across sensory modalities.

Published in:
The Journal of Neuroscience, 38, 2, 263-277

Record created 2017-09-22, last modified 2019-04-18