UC Berkeley neuroscientists have created interactive maps that can predict where different categories of words activate the brain. Their latest map is focused on what happens in the brain when you read stories.
Opening the door to inner narratives
The findings, appearing in the Journal of Neuroscience, provide further evidence that different people share similar semantic — or word-meaning — topography, opening yet another door to our inner thoughts and narratives. They also have practical implications for learning and for speech disorders, from dyslexia to aphasia.
“At a time when more people are absorbing information via audiobooks, podcasts and even audio texts, our study shows that, whether they’re listening to or reading the same materials, they are processing semantic information similarly,” said study lead author Fatma Deniz, a postdoctoral researcher in neuroscience in the Gallant Lab at UC Berkeley and former fellow with the Berkeley Institute for Data Science.
For this latest brain mapping study, people listened to stories from “The Moth Radio Hour,” a popular podcast series, and then read those same stories. Using functional MRI, researchers scanned their brains in both the listening and reading conditions, compared the brain activity data from the two, and found that the maps created from both datasets were virtually identical.
The results can be viewed in an interactive, 3D, color-coded map, where words — grouped in such categories as visual, tactile, numeric, locational, violent, mental, emotional and social — are presented like vibrant butterflies on flattened cortices. The cortex is the coiled surface layer of gray matter of the cerebrum that coordinates sensory and motor information.
The interactive 3D brain viewer is scheduled to go online this week.
As for clinical applications, the maps could be used to compare language processing in healthy people and in those with stroke, epilepsy and brain injuries that impair speech. Understanding such differences can aid recovery efforts, Deniz said.
Decoding the dyslexic brain
The semantic maps can also inform interventions for dyslexia, a widespread, neurodevelopmental language-processing disorder that impairs reading.
“If, in the future, we find that the dyslexic brain has rich semantic language representation when listening to an audiobook or other recording, that could bring more audio materials into the classroom,” Deniz said.
And the same goes for auditory processing disorders, in which people cannot distinguish the sounds or “phonemes” that make up words. “It would be very helpful to be able to compare the listening and reading semantic maps for people with auditory processing disorder,” she said.
Nine volunteers each spent a couple of hours inside functional MRI scanners, listening and then reading stories from “The Moth Radio Hour” as researchers measured their cerebral blood flow.
Their brain activity data from both conditions were then matched against time-coded transcriptions of the stories, and the results were fed into a computer program that scores words according to their relationships to one another.
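As a rough illustration of that kind of pipeline, the sketch below fits a simple voxelwise encoding model in Python. Everything in it is a hypothetical stand-in: the toy word embeddings, the array sizes, and the ridge penalty are invented for the example, and real analyses also model the hemodynamic delay between a word and the fMRI response, which is omitted here.

```python
# Hypothetical sketch of a voxelwise encoding model: word-embedding
# features per fMRI time point (TR) are regressed against voxel
# responses, and the fit is scored on held-out time points.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_trs, emb_dim, n_voxels = 300, 50, 1000         # toy dimensions

# Stimulus features: embeddings of the words presented during each TR,
# summed into one vector per TR (a common simplification; the real
# pipeline is built from the time-coded story transcriptions).
vocab = rng.normal(size=(5000, emb_dim))          # made-up embedding space
word_ids = rng.integers(0, 5000, size=(n_trs, 3))
X = vocab[word_ids].sum(axis=1)                   # (n_trs, emb_dim)

# Simulated voxel responses: a linear readout of the features plus noise.
true_w = rng.normal(size=(emb_dim, n_voxels))
Y = X @ true_w + rng.normal(scale=2.0, size=(n_trs, n_voxels))

# One ridge regression per voxel (sklearn fits all outputs at once).
# The learned weights describe how strongly each semantic dimension
# drives each voxel, which is what gets projected onto the cortex.
model = Ridge(alpha=10.0).fit(X[:250], Y[:250])

# Score each voxel by the correlation between predicted and observed
# responses on held-out TRs, a standard accuracy measure for such models.
pred, obs = model.predict(X[250:]), Y[250:]
r = [np.corrcoef(pred[:, v], obs[:, v])[0, 1] for v in range(n_voxels)]
print(f"median held-out correlation: {np.median(r):.2f}")
```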
The mind’s thesaurus
Using statistical modeling, researchers arranged thousands of words on maps according to their semantic relationships. Under the animals category, for example, one can find the words “bear,” “cat” and “fish.”
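One can get a feel for that arrangement with a toy example: given vectors that place related words near one another, an off-the-shelf clustering step recovers groups like the animals category. The tiny hand-built “embeddings” below are invented purely for illustration; the study derived its semantic space from the stories and the brain data themselves.

```python
# Toy demonstration of grouping words by semantic similarity.
import numpy as np
from sklearn.cluster import KMeans

# Hand-built 3-d vectors, contrived so related words sit near each other;
# a real semantic space would come from word co-occurrence statistics.
words = ["bear", "cat", "fish", "red", "blue", "green", "run", "jump", "walk"]
vecs = np.array([
    [0.9, 0.1, 0.0], [1.0, 0.0, 0.1], [0.8, 0.2, 0.0],   # animal-like
    [0.0, 0.9, 0.1], [0.1, 1.0, 0.0], [0.0, 0.8, 0.2],   # color-like
    [0.1, 0.0, 0.9], [0.0, 0.1, 1.0], [0.2, 0.0, 0.8],   # action-like
])

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(vecs)
for k in range(3):
    print(f"group {k}:", [w for w, lab in zip(words, labels) if lab == k])
```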
The maps, which covered at least one-third of the cerebral cortex, enabled the researchers to predict accurately which words would activate which parts of the brain.
The results of the reading experiment came as a surprise to Deniz, who had anticipated some changes in the way readers versus listeners would process semantic information.
“We knew that a few brain regions were activated similarly when you hear a word and read the same word, but I was not expecting such strong similarities in the meaning representation across a large network of brain regions in both these sensory modalities,” Deniz said.
Her study is a follow-up to a 2016 Gallant Lab study that recorded the brain activity of seven study subjects as they listened to stories from “The Moth Radio Hour.” An interactive brain viewer from that study is already available online.
Future mapping of semantic information will include experiments with people who speak languages other than English, as well as with people who have language-based learning disorders, Deniz said.
Article written by Yasmin Anwar and reposted from UC Berkeley.