The gestures we see can change what we hear
People often use beat gestures, such as moving their hands up and down, to emphasize important words as they speak. But do these gestures have an influence on how well we understand what a speaker is saying?
Researchers at the Max Planck Institute for Psycholinguistics have found that what we see does, in fact, shape what we hear.
“In face-to-face communication, language entails much more than just speech,” explained study senior investigator Hans Rutger Bosker. “Speakers make use of different channels (mouth, hands, and face) to get a message across. We want to understand how listeners make use of these different streams of information when they are listening to someone.”
The McGurk effect is a well-known demonstration that hearing and vision interact in speech perception. The illusion arises when the audio of one sound is paired with the visual mouth movements of another: for example, hearing the syllable “ba” while watching a speaker mouth “ga” often leads listeners to perceive a third sound, “da.”
The researchers set out to investigate whether what we hear may also depend on the beat gestures we see. The study focused on a set of Dutch words that differ only in their stress pattern.
For example, the word “PLAto,” with stress on the first syllable, refers to the ancient Greek philosopher. But when there is stress on the second syllable, the word becomes “plaTO,” or plateau.
The experts designed an experiment to test whether beat gestures used at the first or second syllable would change how people interpreted what they heard.
The study revealed that listeners were more likely to hear stress on a syllable if a beat gesture occurred on that syllable. The effect was the same for both words and non-words. Beat gestures were even found to influence whether people perceived a vowel as long or short.
“Listeners listen not only with their ears, but also with their eyes,” said Bosker. “These findings are the first to show that beat gestures influence which speech sounds you hear.”
According to the researchers, the effect of beat gestures is probably even stronger in real life, when speech is less clear than it is in the lab. In noisy settings, people may rely more on visual beat gestures for successful communication.
“Our findings also have the potential to enrich human-computer interaction and improve multimodal speech recognition systems. It seems clear that such systems should take into account more than just speech,” said Bosker.
“We will follow up on the study by using virtual reality to test how specific these effects are: are they induced by beat gestures only, or also by other types of communicative cues, such as head nods and eyebrow movements?”
The study is published in the journal Proceedings of the Royal Society B.