The brain regions recognized as centers where words are decoded are also important in interpreting gestures, according to new research funded by the National Institute on Deafness and Other Communication Disorders (NIDCD). The findings suggest these regions may play a broader role in interpreting symbols than previously thought [Proceedings of the National Academy of Sciences, 106 (49): 20664-69].
"In babies, the ability to communicate through gestures precedes spoken language, and you can predict a child's language skills based on the repertoire of his or her gestures during those early months," said NIDCD director James Battey, Jr., MD, PhD. "These findings not only provide compelling evidence regarding where language may have come from, they help explain the interplay that exists between language and gesture as children develop language skills."
Scientists have known that sign language is largely processed in the same regions of the brain as spoken language. These regions include the inferior frontal gyrus (Broca's area) in the front left side of the brain and the posterior temporal region (Wernicke's area) toward the back left side of the brain. Signed and spoken languages activate the same brain regions because both operate as full languages, each with its own vocabulary and rules of grammar.
NIDCD researchers collaborated with scientists from Hofstra University School of Medicine in Hempstead, NY, and San Diego State University in San Diego, CA, to find out whether non-language-related gestures are processed in the same brain regions as language. These hand and body movements convey meaning on their own, without having to be translated into specific words or phrases.
Two types of gestures were considered for the study: pantomimes and emblems. Pantomimes mimic objects or actions, such as unscrewing a jar or juggling balls. Emblems, commonly used in social interactions, signify abstract, usually more emotionally charged concepts than pantomimes. Examples include a hand sweeping across the forehead to indicate that it's hot or a finger to the lips to signify the need to be quiet.
The study involved 20 healthy, English-speaking volunteers. Nine men and 11 women underwent functional magnetic resonance imaging (fMRI) while they watched video clips of a person acting out one of the gesture types or voicing the phrases that the gestures represent. As a control, participants also watched clips of a person using meaningless gestures or speaking pseudo-words that had been chopped up and randomly reorganized so the brain would not interpret them as language.
The participants watched 60 video clips for each of the six stimuli, with the clips presented in 45-second blocks at a rate of 15 clips per block. A mirror attached to the head coil enabled each participant to watch the video, which was projected on the scanner room wall. The scientists then measured brain activity for each of the stimuli and looked for similarities and differences, as well as any communication occurring between individual parts of the brain.
The researchers found that the brain was highly activated in the inferior frontal and posterior temporal areas for the gesture and spoken language stimuli. "If gesture and language were not processed by the same system, you'd have spoken language activating the inferior frontal and posterior temporal areas and gestures activating other parts of the brain, but we found virtual overlap," said senior author Allen Braun, MD.
Current thinking in the study of language is that the posterior temporal region serves as a storehouse of words from which the inferior frontal gyrus selects the most appropriate match, like an online search engine that pops up the most suitable Web site at the top of the search results. Rather than being limited to deciphering words alone, the researchers suggested, these regions may be able to apply meaning to any incoming symbols: words, gestures, images, sounds, or objects.
These regions also may present a clue into how language evolved, Dr. Braun said. "Our results fit a longstanding theory that says the common ancestor of humans and apes communicated through meaningful gestures, and over time the brain regions that processed gestures became adapted for using words. If the theory is correct, our language areas may actually be the remnant of this ancient communication system, one that continues to process gestures as well as language in the human brain."
Developing a better understanding of the brain systems that support gestures and words may help in the treatment of some patients with aphasia, he added.