When two sensory inputs – a sight and a sound – occur closely in time, the brain can “bind” them and conclude that they result from a single event (i.e., that they are synchronous). The brain’s ability to do this depends on the amount of time between the two stimuli – a temporal window within which stimuli may be perceived as synchronous.
Previously, medical student Albert Powers III, working with Mark Wallace, professor of hearing and speech sciences, and colleagues, showed that this window could be significantly narrowed by a computer-based perceptual training program.
In the May 2 Journal of Neuroscience, the investigators provide the first look at the brain regions involved in these changes. They found that the posterior superior temporal sulcus (where visual and auditory information comes together) and parts of auditory and visual cortex showed decreased activity on fMRI after training. Additionally, the perceptual training program enhanced the coupling among these regions.
Because abnormalities in multisensory processing may contribute to disorders such as autism, dyslexia and schizophrenia, this information could inform diagnostic strategies and interventions for these disabilities.
The research was supported by the Vanderbilt Kennedy Center and by grants from the National Institute of Child Health and Human Development and the National Institute on Deafness and Other Communication Disorders of the National Institutes of Health.