New Research Shows Sound Helps Primates See
New research reported on Live Science may change the traditional neurological understanding of how humans and other primates perceive and process sensory input.
Results of the research conducted by Ye Wang, Department of Neurobiology and Anatomy, University of Texas-Houston Medical School, and Simona Celebrini, Yves Trotter, and Pascal Barone, Centre de Recherche Cerveau & Cognition, Faculté de Médecine de Rangueil, Toulouse, France, were recently published in BMC Neuroscience as "Visuo-auditory interactions in the primary visual cortex of the behaving monkey". The electrophysiological evidence shows multisensory interactions that occur too quickly in monkeys to be explained by traditional models of neurological sensory processing.
Traditional neurological models of how the brain processes auditory, tactile and visual sensory input are hierarchical: separate input channels run from the sensory receptors for each modality to the corresponding primary sensory cortex, and the various inputs are not integrated into polymodal perception until the signals reach areas of higher cognitive activity such as the frontal, temporal and parietal lobes. More recently, imaging and EEG studies in humans have shown cross-modal activity directly linking areas of the brain involved in processing different sensory modalities, leading to a reevaluation of the traditional view that sensory processing remains isolated until the signals reach areas of higher brain function.
The most recent study used monkeys trained to locate a spot of light projected on a video screen. The experiment compared results using visual-only (V) stimuli with those using visual stimuli coupled with auditory stimuli (A/V). Monitoring 45 to 50 individual neurons in the monkeys' primary visual cortex (V1) for electrophysiological response strength and latency, the study found that the monkeys could quickly locate a visual stimulus under conditions of high contrast but took longer when the contrast was decreased. When the low contrast spot was coupled with an auditory stimulus, the monkeys showed a significant decrease in mean response latency, but showed no improvement in response from A/V input when the visual stimulus was high contrast.
Visual responses to stimuli in V1 typically have mean latencies in the range of 50-70ms. Audio responses are typically much faster: 35ms in the primary auditory cortex (AI) and 45ms in the polymodal area STP, both quicker than the response in V1. The study found an improvement of mean response latency with low contrast V stimuli from 64.5ms to 61.0ms when coupled as A/V stimuli. The improved response time supports the hypothesis that direct links exist between different sensory areas of the brain and that multimodal processing can occur before sensory signals reach areas of higher cognitive brain function.
The researchers further discuss these sensory cross connections between low level cortices as a possible explanation for compensation effects in animals that have lost the function of a particular sense. As an example, the authors cite visual cortex activation identified in blind human subjects while reading Braille, as well as the observation that the blind often have superior hearing and the deaf superior sight. With the loss of a particular sensory input, the corresponding cortex is able to make itself available to process other sensory information.
So it's not so much that we can see sound, as the Live Science headline suggests; it's more accurate to say that sound can help us see more quickly.