The language you speak doesn't appear to change how your brain is organized
Study shows brain 'reads' sentences in the same way across languages.
Shilo Rea
Director of Media Relations, Dietrich College of Humanities & Social Sciences, Carnegie Mellon University

When the brain “reads” or decodes a sentence in English or Portuguese, its neural activation patterns are the same, researchers report.
Published in NeuroImage, the study is the first to show that different languages have similar neural signatures for describing events and scenes. Using a machine-learning algorithm, the research team modeled the relationship between sentence meaning and brain activation patterns in English, and then recognized sentence meaning from activation patterns in Portuguese.
The findings can be used to improve machine translation, brain decoding across languages, and, potentially, second language instruction.
“This tells us that, for the most part, the language we happen to learn to speak does not change the organization of the brain,” says Marcel Just, professor of psychology at Carnegie Mellon University.
“Semantic information is represented in the same place in the brain and the same pattern of intensities for everyone. Knowing this means that brain-to-brain or brain-to-computer interfaces can probably be the same for speakers of all languages,” Just says.
For the study, 15 native Portuguese speakers—eight were bilingual in Portuguese and English—read 60 sentences in Portuguese while in a functional magnetic resonance imaging (fMRI) scanner. A computational model developed at Carnegie Mellon was able to predict which sentences the participants were reading in Portuguese, based only on activation patterns.
The computational model uses a set of 42 concept-level semantic features and six markers of the concepts’ roles in the sentence, such as agent or action, to identify brain activation patterns in English.
With 67 percent accuracy, the model predicted which sentences were read in Portuguese. The resulting brain images showed that the activation patterns for the 60 sentences were in the same brain locations and at similar intensity levels for both English and Portuguese sentences.
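To make the general approach concrete, here is a minimal sketch of a feature-based cross-language decoder of the kind described above: a linear model learns to map activation patterns recorded during English reading into a 48-dimensional space of semantic features and role markers, and Portuguese trials are then decoded by matching the predicted feature vectors against the candidate sentences. The Ridge regressor, the synthetic data, the voxel count, and the cosine-similarity matching step are illustrative assumptions for this sketch, not the authors' published pipeline.

```python
# Minimal sketch of a feature-based cross-language sentence decoder.
# Everything below (the Ridge regressor, the voxel count, the synthetic
# random data, the cosine-similarity matching) is an illustrative
# assumption, not the study's actual model or data.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics.pairwise import cosine_similarity

N_FEATURES = 42 + 6   # 42 concept-level semantic features + 6 role markers
N_VOXELS = 5000       # hypothetical number of voxels after selection
N_SENTENCES = 60

rng = np.random.default_rng(0)

# Stand-ins for the real data: activation patterns recorded while
# participants read the 60 sentences in English, plus each sentence's
# hand-coded semantic feature vector.
X_english = rng.standard_normal((N_SENTENCES, N_VOXELS))
Y_features = rng.standard_normal((N_SENTENCES, N_FEATURES))

# Learn a linear map from brain activation to the shared feature space.
decoder = Ridge(alpha=1.0)
decoder.fit(X_english, Y_features)

# Apply the same map to activation recorded during Portuguese reading.
X_portuguese = rng.standard_normal((N_SENTENCES, N_VOXELS))
predicted = decoder.predict(X_portuguese)

# Decode each Portuguese trial by picking the candidate sentence whose
# feature vector is most similar to the prediction.
similarity = cosine_similarity(predicted, Y_features)
decoded = similarity.argmax(axis=1)
accuracy = (decoded == np.arange(N_SENTENCES)).mean()
print(f"decoding accuracy on synthetic data: {accuracy:.2f}")
```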
Additionally, the results revealed that the activation patterns could be grouped into four semantic categories, depending on the sentence's focus: people, places, actions, and feelings. The groupings were very similar across languages, reinforcing the finding that the organization of information in the brain is the same regardless of the language in which it is expressed.
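As a loose illustration of that grouping step, the following snippet clusters sentence-level activation patterns into four groups with k-means. The clustering algorithm and the random data are assumptions made for the example; the study itself derived the people/places/actions/feelings structure from measured fMRI patterns.

```python
# Illustrative grouping of sentence-level activation patterns into four
# clusters, loosely mirroring the people / places / actions / feelings
# split described in the article. KMeans and the random data are
# assumptions, not the authors' analysis.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
activation = rng.standard_normal((60, 5000))  # 60 sentences x synthetic voxels

labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(activation)
for cluster in range(4):
    members = np.where(labels == cluster)[0]
    print(f"cluster {cluster}: {members.size} sentences, e.g. {members[:3]}")
```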
“The cross-language prediction model captured the conceptual gist of the described event or state in the sentences, rather than depending on particular language idiosyncrasies. It demonstrated a meta-language prediction capability from neural signals across people, languages, and bilingual status,” says Ying Yang, a postdoctoral associate in psychology and first author of the study.
Additional coauthors are from Carnegie Mellon and the Federal University of Santa Catarina in Brazil.
The Office of the Director of National Intelligence and the Intelligence Advanced Research Projects Activity via the US Air Force Research Laboratory funded this research.