New Study: Can the Eyes Play Tricks on the Ears?

My mother says that the younger generation is going to hell in a handbasket; they are unable to communicate with, or understand, the world or people outside of their own select peer group. As a teacher, there are days when I am inclined to agree with her. I often wonder what it is that they missed in their upbringing: what did I have that they didn’t?

Dr. Jennifer Groh’s recent research may shed some light. Located in the midbrain is a little-understood area called the inferior colliculus. Smaller than a half dollar, this region serves as the switching station that routes visual and auditory information into a believable sequence. Without it, what we see and hear would be a jumbled mess of incoming nonsense.

This area produces the effect that lets ventriloquists seem to throw their voices. It is also the region that allows us to suspend our disbelief about the separation between the character on the TV or movie screen and the speakers that actually amplify and deliver the sound.

The media and advertising have recently striven to create sensory experiences using many of the triggers that trick the brain into blending incoming information. This gives us, the general population, the sense that “we are there,” experiencing an event even though it is a virtual one. On the less invasive end is the association of “red” with “cherry” or “strawberry” tastes and smells, or “purple” with grape. Big-screen examples include shootouts, earthquakes (shaking cameras), war zones, and exploding bombs. The Wii, with virtual reality games that combine sound, sight, and physical movement, is another example.

There is a limit to the spatial and temporal window involved in the brain’s processing. If an event is not synchronized visually and auditorily, the brain does not meld the two experiences together. We perceive a separation between the visual image and the sounds we hear; the lips don’t move in sync with the words. We experience a sensation of disbelief: what the speaker is saying is not what the rest of the picture (the gestures and expressions) is telling us.

So why is all of this important? How does this “inferior colliculus” really affect us? I think that almost everyone experiences the winter blues: cold weather, less sunlight, less visual and auditory stimulation. For some, the blues give way to seasonal depression, also known as Seasonal Affective Disorder (SAD). The current therapy for SAD is the use of light boxes in the early morning to simulate sunlight, which decreases depression symptoms.

According to various sites on SAD, positive effects of light boxes are felt within 2 to 4 days, with symptoms becoming minor within 2 to 4 weeks. Using the information we now have about the inferior colliculus, what would happen if sounds of spring (birds, running water, breezes in the trees) were added to the light box experience? Maybe the depression wouldn’t take as long to remediate (just a thought).

John Keyes suggests that a bi-sensory approach to disabilities of speech, visual perception, and auditory perception would increase the likelihood of successful rehabilitation. It is the process of integrating all of the cues, visual and auditory, including tone of voice, facial expressions, movement of the mouth, and pitch. It is called multi-sensory cue training, and it begins at birth. Its importance is that it helps people fill in those spaces where auditory information is missing due to disruptions that make hearing difficult. (5)

How does the younger generation fit into this? Communication for the younger generation includes text messaging, big-screen productions, computer sites that answer their questions, and earbuds that connect them to a multitude of songs; they have little real-world interaction outside of their own peer group.

The “younger generation” has the technology down pat, but they are missing real-world experience; they lack the practice to keep the switching system for sound and sight sharp. With the amount of information that assails them daily, it is quite possible that the inferior colliculus is overwhelmed; without practice, what is coming in is nonsense to them. When they don’t understand something, teenagers simply put it on “pause.”

So what did I have that kids today don’t have? Kids have all the necessary speakers and screens to be visually and auditorily stimulated. In my day, this was called real life.

Sources:
Anderson, Corinne D. “Auditory and Visual Characteristics of Individual Talkers in Multimodal Speech Perception.” The Ohio State University, published June 2007. Accessed January 6, 2008. https://kb.osu.edu/dspace/handle/1811/28373

Keyes, John. “Effects and Interactions of Auditory and Visual Cues in Oral Communication.” ERIC, #ED002887. Accessed January 6, 2008. www.eric.gov

Porter, Kristin K.; Metzger, Ryan; Groh, Jennifer. “Visual and saccade-related signals in the primate inferior colliculus.” Accessed January 7, 2008. www.pnas.org/cgi/reprint/0706249104v1?maxtoshow=&HITS=10&hits=10&RESULTFORMAT=&fulltext=Groh&searchid=1&FIRSTINDEX=0&sortspec=date&resourcetype=HWCIT

Rahne, Torsten; Bockmann, Martin; von Specht, Helmut; Sussman, Elyse. “Visual Cues Can Modulate Integration and Segregation of Objects in Auditory Scene Analysis.” NIH Public Access, published January 2007. Accessed January 7, 2008. www.pubmedcentral.nih.gov/articlerender.fegi?artid=1885229

Swaminathan, Nikhil. “When the Eyes Play Tricks on the Ears.” Scientific American, October 29, 2007. National Academy of Sciences. Accessed January 6, 2008. www.pnas.org/cgi/content/full/104/45/17855