Your brain doesn't just tune into speech; it picks up the rhythm of sign language as well
Neuroscientists want to understand how the human brain creates and perceives language. (Image: REUTERS/Tami Chappell)
The way the brain locks into patterns of speech is not unique to understanding spoken language, a new study suggests. The process, called entrainment, is also part of understanding sign language.
The human brain works in rhythms and cycles. These patterns occur at predictable frequencies that depend on what a person is doing and on what part of the brain is active during the behavior.
Similarly, there are rhythms and patterns out in the world, and for the last 20 years, scientists have been perplexed by the brain’s ability to “entrain,” or match up, with these patterns.
Language is one of those areas in which scientists observe neural entrainment: When people listen to speech, their brain waves lock on to the volume-based rhythms they hear. Since people can’t pay attention to everything happening in their environment at once, this phase locking is thought to help the brain anticipate when important information is likely to appear.
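In signal-processing terms, that volume-based rhythm is the amplitude envelope of speech. The following is a minimal, illustrative sketch, not the study's own pipeline, of how one might extract that envelope in Python and isolate the below-8 Hz band where speech rhythms live; the file name speech_sample.wav is hypothetical.

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import hilbert, butter, filtfilt

# Load a speech recording (hypothetical file name; assumed mono).
rate, audio = wavfile.read("speech_sample.wav")
audio = audio.astype(float)

# The "volume-based rhythm" is the amplitude envelope:
# the magnitude of the Hilbert analytic signal.
envelope = np.abs(hilbert(audio))

# Speech rhythms of interest sit below about 8 Hz, so low-pass
# the envelope to isolate syllable- and phrase-scale pulses.
b, a = butter(4, 8.0 / (rate / 2), btype="low")
slow_envelope = filtfilt(b, a, envelope)
```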
Favorite frequencies
Many studies have documented this phenomenon in language processing; however, it has been difficult to tell whether neural entrainment is specialized for spoken language.
“To determine if neural entrainment to language is specialized for speech or if it is a general-purpose tool that humans can use for anything that is temporally predictable, we had to go outside of speech and outside of auditory perception,” says Geoffrey Brookshire, the study’s lead author and a doctoral student in the psychology department at the University of Chicago.
“By looking at sign, we’re learning something about how the brain processes language more generally. We’re solving a mystery we couldn’t crack by studying speech alone,” says Daniel Casasanto, assistant professor of psychology and leader of the university’s Experience and Cognition Lab.
In speech, the brain locks on to syllables, words, and phrases, and those rhythms occur below 8 Hz, or eight cycles per second. Vision also has a preferred frequency that it latches onto.
“When we focus on random flashes of light, for example, our brains most enthusiastically lock on to flashes around 10 Hz. By looking at sign language, we can ask whether the important thing for entrainment is which sense you’re using, or the kind of information you’re getting,” Brookshire says.
To determine whether people tune into the visual rhythms of language the same way they tune into its auditory rhythms, the researchers showed videos of stories told in American Sign Language to fluent signers and recorded their brain activity with electroencephalography (EEG) as they watched.
Once the researchers had these EEG readings, they needed a way to measure the visual rhythms of sign language.
While there are well-established methods to measure rhythms in speech, there are no automatic, objective equivalents for the temporal structure of sign language. So the researchers created one.
‘An exciting finding’
They developed a new metric, called the instantaneous visual change, which summarizes the degree of visual change at each moment during signing. They ran the experimental videos, the same ones participants watched, through the new algorithm to identify peaks and valleys in visual change between frames. The largest peaks corresponded to large, quick movements.
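The article doesn't give the metric's exact formula, but the description suggests simple frame differencing. Here is a minimal sketch under that assumption, with a hypothetical file name asl_story.mp4: visual change is taken as the mean absolute pixel difference between consecutive grayscale frames, and the largest peaks are located by their prominence.

```python
import numpy as np
import cv2  # OpenCV, used here for video decoding
from scipy.signal import find_peaks

def instantaneous_visual_change(video_path):
    """Mean absolute pixel difference between consecutive frames.

    A plausible stand-in for the study's metric: large, quick
    movements produce large frame-to-frame differences.
    """
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    if not ok:
        raise IOError("could not read video")
    prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY).astype(float)
    changes = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(float)
        changes.append(np.mean(np.abs(gray - prev)))
        prev = gray
    cap.release()
    return np.array(changes)

change = instantaneous_visual_change("asl_story.mp4")  # hypothetical file
peaks, _ = find_peaks(change, prominence=change.std())  # large, quick movements
```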
With this measure of the magnitude of visual change over time in each video, Brookshire overlaid the participants' EEG recordings to see whether people entrain at vision's typical frequency of about 10 Hz, or at the lower frequencies of signs and phrases in sign language, about 2 Hz.
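One standard way to quantify such overlap is spectral coherence between the EEG and the stimulus signal; whether the study used coherence, phase-locking values, or another measure isn't stated here. A self-contained toy sketch, assuming both series were resampled to a common rate and share a 2 Hz rhythm:

```python
import numpy as np
from scipy.signal import coherence

fs = 100.0                      # common sampling rate in Hz (an assumption)
t = np.arange(0, 120, 1 / fs)   # two minutes of toy data
rng = np.random.default_rng(0)

# Toy stand-ins: a shared 2 Hz rhythm drives both the visual-change
# series and the EEG channel, each with independent noise.
shared = np.sin(2 * np.pi * 2.0 * t)
visual_change = shared + rng.normal(0.0, 1.0, t.size)
eeg = 0.5 * shared + rng.normal(0.0, 1.0, t.size)

freqs, coh = coherence(eeg, visual_change, fs=fs, nperseg=1024)
print("coherence near 2 Hz:", coh[np.argmin(np.abs(freqs - 2.0))])
print("coherence near 10 Hz:", coh[np.argmin(np.abs(freqs - 10.0))])
```

With real recordings, high coherence near 2 Hz but not near 10 Hz would mirror the pattern described next.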
Participants’ brain waves locked into the specific frequencies of sign language, rather than locking into the higher frequency that vision tends to prefer.
“This is an exciting finding because scientists have been theorizing for years about how adaptable or flexible entrainment may be, but we were never sure if it was specific to auditory processing or if it was more general purpose,” Brookshire says. “This study suggests that humans have the ability to follow perceptual rhythms and make temporal predictions in any of our senses.”
In a broader sense, neuroscientists want to understand how the human brain creates and perceives language, and entrainment has emerged as an important mechanism.
By revealing neural entrainment as a general-purpose strategy for improving sensitivity to informational peaks, this study advances the understanding of human language and perception.
“The piece of the paper that I find particularly exciting is that it compares how signers and non-signers process American Sign Language stimuli,” says Susan Goldin-Meadow, a professor in the department of psychology and scholar of language and gesture.
“Although both groups showed the same level of entrainment in early visual regions, they displayed differences in frontal regions. This finding sets the stage for us to identify aspects of neural entrainment that are linked to the physical properties of the visual signal, compared with aspects that appear only with linguistic knowledge,” she says.