The brain uses a single mechanism, known as supramodality, to evaluate our confidence across various senses such as audition, touch, and vision, according to a study published in the Journal of Neuroscience.
Metacognition is a term often used by behavioural scientists and psychologists to describe high-level cognitive skills such as “thinking about thinking,” “knowing about knowing,” and “being aware of being aware.” These broadly refer to our capacity to access, report, and regulate our own mental states.
Specifically, metacognition makes it possible for our brain to estimate our level of confidence in sensory inputs from the external world, such as sound, light, or touch. Our response to these stimuli hinges largely on how accurate these estimates are, which can be crucial in daily life, for instance when hearing a baby cry or smelling a gas leak. The brain evaluates confidence after processing inputs from multiple senses simultaneously.
Neuroscientists have long debated how metacognition works across different senses: does it use the same rules for visual, auditory, and tactile stimuli, or different ones for each? The former view, the “common rules” account, is known as “supramodality.”
A recent study based on a series of experiments from Olaf Blanke’s lab at EPFL has now settled the case in favour of supramodality.
For more, read: A Brain System That Builds Confidence in What We See, Hear and Touch