02-22-2020

How do our brains shift to understand and pay attention to language?

The human brain reacts differently to the sounds of an understandable language than it does to white noise, non-speech sounds, or unfamiliar languages. That much is old news to neuroscientists. What they didn’t know until recently was how quickly the brain shifts to attend to, process, and understand familiar speech sounds.

Now, researchers from the University of Maryland have tracked and timed this shift, documenting their findings in a paper published in the journal Current Biology.

Christian Brodbeck, a postdoctoral researcher at the Institute for Systems Research (ISR), L. Elliot Hong of the University of Maryland School of Medicine, and Professor Jonathan Z. Simon, who holds a triple appointment in the Departments of Biology and Electrical and Computer Engineering as well as ISR, found that the brain begins processing language within a tenth of a second of the ears picking it up.

According to Simon, “When we listen to someone talking, the change in our brain’s processing from not caring what kind of sound it is to recognizing it as a word happens surprisingly early. In fact, this happens pretty much as soon as the linguistic information becomes available.”

Using magnetoencephalography (MEG), a non-invasive neuroimaging method in which sensitive magnetometers track the magnetic fields produced by the brain, the team mapped and examined study participants’ neural activity while they listened to a single person telling a story.
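For readers curious what working with this kind of recording looks like in practice, here is a minimal, hypothetical sketch using the open-source MNE-Python library and its bundled sample dataset. It is not the authors’ analysis pipeline (their study examined responses to continuous narrated speech); it only illustrates the general idea of cutting MEG recordings into segments around auditory events and averaging them to see the early cortical response.

    # Hypothetical sketch: epoching and averaging MEG data with MNE-Python's
    # bundled sample dataset. Not the authors' pipeline; illustration only.
    from pathlib import Path
    import mne
    from mne.datasets import sample

    data_path = Path(sample.data_path())  # downloads the sample data on first run
    raw_fname = data_path / "MEG" / "sample" / "sample_audvis_raw.fif"

    raw = mne.io.read_raw_fif(raw_fname, preload=True)
    raw.filter(1.0, 40.0)                                  # keep the 1-40 Hz band
    events = mne.find_events(raw, stim_channel="STI 014")  # stimulus triggers

    # Event ID 1 marks left-ear auditory tones in this sample dataset
    epochs = mne.Epochs(raw, events, event_id={"auditory/left": 1},
                        tmin=-0.2, tmax=0.5, baseline=(None, 0), preload=True)

    evoked = epochs.average()  # average the response across trials
    evoked.plot()              # early auditory components appear around 100 ms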

The results showed that the brain’s auditory cortex analyzes the acoustic patterns and phonetic sounds that make up syllables within a tenth of a second of those sounds entering the ears. Within that sliver of time, the brain is able to distinguish language sounds from non-language sounds, understand the linguistic message, and keep up with a speaker talking at roughly three words per second by predicting what it is likely to hear next.

“We usually think that what the brain processes this early must be only at the level of sound, without regard for language,” Simon said. “But if the brain can take knowledge of language into account right away, it would actually process sound more accurately. In our study we see that the brain takes advantage of language processing at the very earliest stage it can.”

The researchers also found that people are able to selectively hear a single speaker in a noisy environment. The brain focuses on one person speaking while blocking out other intelligible conversation in the same environment, as at a cocktail party.

“This may reveal a ‘bottleneck’ in our brains’ speech perception,” Brodbeck said. “We think lexical perception works by our brain considering the match between the incoming speech signal and many different words at the same time. It could be that this mechanism involves mental resources that have limitations on how many different options can be tried simultaneously, making it impossible to attend to more than one speaker at the same time.”
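To make Brodbeck’s point concrete, here is a toy sketch, purely illustrative and not the authors’ model, of incremental lexical competition: as each new speech sound arrives, only the words consistent with what has been heard so far remain active, and a fixed budget stands in for the limited resources available to evaluate candidates in parallel. Letters stand in for phonemes, and the small lexicon is invented for the example.

    # Toy illustration of lexical competition under a resource limit.
    # Conceptual sketch only; not the authors' model of speech perception.
    LEXICON = ["cat", "cap", "captain", "candle", "dog", "dot"]

    def active_candidates(sounds_heard, budget=4):
        """Return words still consistent with the input so far, capped at `budget`."""
        prefix = "".join(sounds_heard)
        matches = [w for w in LEXICON if w.startswith(prefix)]
        return matches[:budget]  # the "bottleneck": only so many candidates at once

    # As more of the word is heard, the candidate set narrows:
    print(active_candidates(["c"]))             # ['cat', 'cap', 'captain', 'candle']
    print(active_candidates(["c", "a", "p"]))   # ['cap', 'captain']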

Having timed how quickly the brain begins to understand language, the team hopes their findings will lay the groundwork for others to determine exactly how the brain works out which word is being said, and what that word means in a given context.

By Olivia Harvey, Earth.com Staff Writer
