Neuroscientists have identified a brain signal that corresponds with speech comprehension. The signal is present when someone understands the words being spoken, and absent when the listener either did not understand what was said or was not paying attention.
According to the study’s lead author, Ed Lalor of Trinity College Dublin, the discovery of this brain signal could be groundbreaking.
“Potential applications include testing language development in infants, or determining the level of brain function in patients in a reduced state of consciousness,” said Professor Lalor.
“The presence or absence of the signal may also confirm if a person in a job that demands precision and speedy reactions – such as an air traffic controller, or soldier – has understood the instructions they have received, and it may perhaps even be useful for testing for the onset of dementia in older people based on their ability to follow a conversation.”
Throughout the day, people routinely speak at around 120 to 200 words per minute. At the upper end, that is a new word roughly every 300 milliseconds, so listeners must grasp the meaning of each word very rapidly to keep up with the conversation.
How our brains process words so quickly, even though a word's meaning can shift greatly with context, has long been a mystery to scientists. But now, researchers have demonstrated that our brains rapidly compute how similar in meaning each word is to the words that came just before it.
For their investigation, the researchers first looked at how smartphones and modern computers can “understand” speech. These systems approximate comprehension by measuring how similar in meaning one word is to another.
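The article does not spell out the method, but a common way such systems measure word-to-word similarity is to represent each word as a vector of numbers (a word embedding) and compare vectors by the cosine of the angle between them. A minimal sketch with made-up toy vectors (real systems learn hundreds of dimensions from large text corpora):

```python
import math

# Hypothetical toy embeddings: related words point in similar directions.
embeddings = {
    "cat":   [0.9, 0.8, 0.1],
    "dog":   [0.8, 0.9, 0.2],
    "piano": [0.1, 0.2, 0.9],
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: close to 1.0 means
    the vectors (and, by proxy, the words) are very similar."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

print(cosine_similarity(embeddings["cat"], embeddings["dog"]))    # high
print(cosine_similarity(embeddings["cat"], embeddings["piano"]))  # lower
```

With these toy vectors, “cat” scores much closer to “dog” than to “piano”, mirroring how a listener would judge the words' meanings.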
The researchers set out to determine if humans compute the similarity between words as well. They used a technique known as electroencephalography (EEG), which records electrical brainwave signals, as participants listened to a number of audiobooks.
When the experts examined the brain activity that had been captured, they identified a specific brain response that represented how similar or different a given word was from the words that preceded it.
The researchers discovered that the signal completely disappeared when the participants could not understand the speech or when they stopped paying attention. The team concluded that this signal provides a critically important and sensitive representation of whether individuals are truly comprehending the speech they are hearing.
“There is more work to be done before we fully understand the full range of computations that our brains perform when we understand speech,” said Professor Lalor.
“However, we have already begun searching for other ways that our brains might compute meaning, and how those computations differ from those performed by computers. We hope the new approach will make a real difference when applied in some of the ways we envision.”
The study is published in the journal Current Biology.