10-29-2021

Dogs and infants recognize words using similar brain regions

When human infants learn a language, they look for regularities in the patterns of syllable use within a continuous stream of speech. This technique enables them to identify probable word boundaries: syllables that frequently occur together are likely to form words, while those that rarely do are likely to span a boundary between words. Keeping track of patterns in this manner, and learning from the regularities, is known as statistical learning, and it involves complex computations.

A new study by Hungarian researchers has found the first evidence of statistical learning of linguistic stimuli in dogs. After only a short exposure (8.5 minutes) to continuous speech made up of meaningless words, the dogs in the study had already segmented the sounds into probable words, using neural and computational mechanisms similar to those used by human infants.

In order to identify and learn probable words from continuous speech, dogs have to keep track of the probability that two different syllables will occur together. This requires complex computation and is the same mechanism by which infants learn to recognize individual words, even before they know the meanings of these words. 
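As a rough illustration of this kind of computation (a minimal sketch using invented syllables, not the study's actual stimuli), the snippet below estimates the transitional probability between adjacent syllables in a continuous stream and shows how dips in that probability point to likely word boundaries.

```python
from collections import Counter

# Invented three-syllable "words" (not the study's actual stimuli), concatenated
# into one continuous stream with no pauses between them.
word_sequence = ["golatu", "pabiku", "golatu", "daropi", "pabiku",
                 "daropi", "golatu", "pabiku", "daropi", "golatu"]
stream = "".join(word_sequence)

# Split the stream back into its two-letter syllables.
syllables = [stream[i:i + 2] for i in range(0, len(stream), 2)]

# Count how often each syllable occurs, and how often each adjacent pair occurs.
syllable_counts = Counter(syllables)
pair_counts = Counter(zip(syllables, syllables[1:]))

def transitional_probability(a, b):
    """P(b follows a) = count(a, b) / count(a)."""
    return pair_counts[(a, b)] / syllable_counts[a]

# Within-word transitions (e.g. "go" -> "la") score close to 1.0, while
# transitions that cross a word boundary (e.g. "tu" -> "pa") score much lower,
# so dips in transitional probability mark probable word boundaries.
for a, b in [("go", "la"), ("la", "tu"), ("tu", "pa")]:
    print(f"P({b} | {a}) = {transitional_probability(a, b):.2f}")
```

In this toy stream, the within-word transitions come out at 1.00 while the boundary transition drops to 0.50, which is the cue a statistical learner can exploit to carve words out of continuous speech.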

“This is exactly how humans, even 8-month-old infants, solve the seemingly difficult task of word segmentation: they calculate complex statistics about the probability of one syllable following the other,” explained Marianna Boros, one of the lead authors of the study, and a postdoctoral researcher at the Neuroethology of Communication Lab, Department of Ethology, Eötvös Loránd University.

“Until now we did not know if any other mammal can also use such complex computations to extract words from speech. We decided to test family dogs’ brain capacities for statistical learning from speech. Dogs are the earliest domesticated animal species and probably the one we speak most often to. Still, we know very little about the neural processes underlying their word learning capacities.”

Study co-lead author Lilla Magyari, a postdoctoral researcher in the same research group, laid the methodological foundations for performing non-invasive electrophysiology on awake, untrained, cooperative dogs.

“To find out what kind of statistics dogs calculate when they listen to speech, first we measured their electric brain activity using EEG,” said Magyari.

“Interestingly, we saw differences in dogs’ brain waves for frequent compared to rare words. But even more surprisingly, we also saw brain wave differences for syllables that always occurred together compared to syllables that only occasionally did, even if total frequencies were the same.”

“So, it turns out that dogs keep track not only of simple statistics (the number of times a word occurs) but also of complex statistics (the probability that a word’s syllables occur together). This has never been seen in other non-human mammals before. It is exactly the kind of complex statistics human infants use to extract words from continuous speech.”
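To illustrate the distinction Magyari describes (using invented counts, not data from the study), the short example below contrasts simple frequency with transitional probability: two syllable pairs occur equally often overall, yet only one is a reliable within-word transition.

```python
# Invented counts (not the study's data): both syllable pairs occur 30 times,
# so their simple frequency is identical, but only "bi -> ku" is a reliable
# within-word transition.
syllable_counts = {"bi": 30, "ro": 90}
pair_counts = {("bi", "ku"): 30, ("ro", "pi"): 30}

for (a, b), n in pair_counts.items():
    tp = n / syllable_counts[a]  # transitional probability P(b | a)
    print(f"{a}-{b}: pair frequency = {n}, P({b} | {a}) = {tp:.2f}")
# bi-ku: pair frequency = 30, P(ku | bi) = 1.00  -> likely part of the same word
# ro-pi: pair frequency = 30, P(pi | ro) = 0.33  -> likely spans a word boundary
```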

To explore whether dogs use the same brain regions as human infants when computing the statistical patterns in continuous speech, the researchers tested the dogs using functional MRI. This technique measures brain activity by detecting changes in blood flow to different regions of the brain. 

The dogs were trained to lie motionless while the measurements were taken. All of the dogs remained awake, cooperative and unrestrained during the testing.

“We know that in humans both general learning-related and language-related brain regions participate in this process. And we found the same duality in dogs,” explained Boros. “Both a generalist and a specialist brain region seemed to be involved in statistical learning from speech, but the activation patterns were different in the two.”

“The generalist brain region, the so-called basal ganglia, responded more strongly to a random speech stream (where no words could be spotted using syllable statistics) than to a structured speech stream (where words were easy to spot just by computing syllable statistics). The specialist brain region, the so-called auditory cortex, which in humans plays a key role in statistical learning from speech, showed a different pattern: here we saw brain activity increase over time for the structured but not for the random speech stream. We believe that this activity increase is the trace that word learning leaves on the auditory cortex,” said Boros.

“We now begin to understand that some computational and neural processes that are known to be instrumental for human language acquisition may not be unique to humans after all,” said Attila Andics, principal investigator of the Neuroethology of Communication Lab. 

“But we still don’t know how these human-analogue brain mechanisms for word learning emerged in dogs. Do they reflect skills that developed by living in a language-rich environment, or during the thousands of years of domestication, or do they represent an ancient mammalian capacity?”

The researchers conclude that by studying speech processing in different breeds of dogs, or even other species that live in close contact with humans, it may be possible to trace the origins of human specializations for speech perception. 

The results of the study are published today in the journal Current Biology.

By Alison Bosman, Earth.com Staff Writer
