How does the human brain organize language? That is the question scientists are tackling as they build a “semantic atlas” showing how areas of the brain respond to words with similar meanings. Surprisingly, the results also challenge the prevailing belief that language is limited to a few brain regions.
Researchers from UC Berkeley carried out a brain imaging study that recorded neural activity while volunteers listened to the U.S. radio show “Moth Radio Hour.” Listening to more than two hours of stories from the program while lying still inside an MRI scanner yielded telling details about how the brain represents the meaning of spoken language, word by word. The resulting language maps turned out to be quite similar across different people. “The similarity in semantic topography across different subjects is really surprising,” said neuroscience researcher and study lead author Alex Huth.
The researchers collected data on changes in blood flow and oxygenation — which indicate activity — in various areas of the cerebral cortex, the outer layer of the brain that is instrumental in functions such as language and consciousness. The data were then matched against time-coded records of the stories, with an algorithm scoring words by how closely related they are in meaning.
The results were turned into a thesaurus-like map in which words were arranged across the left and right hemispheres of the brain. Not surprisingly, the map revealed that many brain areas represent language describing people and social relations rather than abstract ideas.
The same word, though, could be repeated several times on various sections of the brain map. Take, for instance, the word “top”: it was represented in the brain area responding to words related to clothing and appearance, as well as in a location dealing with measurement and numbers.
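The kind of relatedness scoring described above can be illustrated with a toy model. The study's actual semantic model is far more sophisticated, but the intuition — represent each word as a vector over meaning-related dimensions and score pairs by cosine similarity — can be sketched in a few lines. The words, dimensions, and vector values below are invented for illustration only; note how an ambiguous word like “top” scores as related to both the clothing and the measurement clusters.

```python
import math

# Toy illustration (NOT the study's actual model): each word is a small
# hand-made vector over hypothetical meaning dimensions, and word pairs are
# scored by cosine similarity, so related words get higher scores.
VECTORS = {
    # dimensions: [clothing, appearance, measurement, number]
    "top":   [3.0, 2.0, 2.0, 1.0],   # ambiguous: clothing AND measurement senses
    "shirt": [4.0, 1.0, 0.0, 0.0],
    "meter": [0.0, 0.0, 4.0, 2.0],
    "four":  [0.0, 0.0, 1.0, 4.0],
}

def cosine(u, v):
    """Cosine similarity: dot(u, v) / (|u| * |v|)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def relatedness(w1, w2):
    return cosine(VECTORS[w1], VECTORS[w2])

# "top" is moderately related to BOTH clusters, mirroring its two map locations,
# while "shirt" and "meter" share no dimensions and score zero.
print(round(relatedness("top", "shirt"), 2))
print(round(relatedness("top", "meter"), 2))
print(round(relatedness("shirt", "meter"), 2))
```

In this sketch, a word that loads on two clusters of dimensions is “close” to both — a rough analogue of the same word lighting up two separate locations on the semantic map.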
According to Huth, current semantic models accurately predict responses to language across “big swaths of cortex,” while the fine-grained details reveal what kind of information each brain area represents.
The semantic maps are “broadly consistent” across individuals, added senior author and neuroscientist Jack Gallant, but there are significant individual differences that the team hopes to probe in a larger and more diverse study sample.
These detailed maps could one day give a voice to people who are unable to speak, such as patients with stroke, brain damage, or motor neuron disease, including ALS. Mind-reading technology may still be far off, but scientists are now closer to decoding inner thoughts by examining how the brain organizes language.
Clinicians, for example, could monitor the brain activity of such patients and match the data to the semantic maps to work out what they are trying to say but cannot articulate.