Read my lips

Why do mismatched speech and mouth movements result in a completely different message? An algorithm may solve the puzzle


Scientists have come a step closer to explaining why the brain perceives a different message when visual and auditory speech do not match.

New research, published in PLOS Computational Biology, delves into the mechanism behind this illusion, which is known as the McGurk effect.

Baylor College of Medicine professor of neurosurgery, Dr Michael Beauchamp, told OT that the brain normally pairs the speech sounds and mouth movements of conversation partners effortlessly.

“All humans grow up listening to tens of thousands of speech examples, with the result that our brains contain a comprehensive mapping of the likelihood that any given pair of mouth movements and speech sounds go together,” he added.

The McGurk effect occurs when the ability to pair auditory and visual cues goes awry: the mouth movements override the sounds that are heard, causing a person to perceive a different message from the one that was spoken.

The listener is only able to hear the correct message when their eyes are closed.

Researchers created an algorithmic model of multisensory speech perception based on the principle of causal inference.

Dr Beauchamp said the model was able to predict how study participants would integrate auditory and visual speech syllables.
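Causal inference here means the brain weighs the chance that a sound and a mouth movement came from the same talker against the chance that they did not, and only fuses the two cues in the first case. A minimal sketch of that idea in Python (not the authors' published code; the Gaussian noise levels, prior width, and prior probability of a common cause are all illustrative assumptions) might look like:

```python
import math

def causal_inference(x_a, x_v, sigma_a, sigma_v, sigma_p=10.0, p_common=0.5):
    """Bayesian causal inference over an auditory cue x_a and a visual cue x_v.

    sigma_a, sigma_v: sensory noise of each modality (illustrative values)
    sigma_p: width of a zero-mean Gaussian prior over the stimulus
    p_common: prior probability that both cues share one cause
    Returns (posterior probability of a common cause, combined estimate).
    """
    va, vv, vp = sigma_a ** 2, sigma_v ** 2, sigma_p ** 2

    # Likelihood of the cue pair under a single common cause
    # (closed form after integrating out the unknown stimulus value).
    denom1 = va * vv + va * vp + vv * vp
    like_c1 = math.exp(-0.5 * ((x_a - x_v) ** 2 * vp + x_a ** 2 * vv
                               + x_v ** 2 * va) / denom1) \
              / (2 * math.pi * math.sqrt(denom1))

    # Likelihood under two independent causes, one per modality.
    like_c2 = math.exp(-0.5 * (x_a ** 2 / (va + vp) + x_v ** 2 / (vv + vp))) \
              / (2 * math.pi * math.sqrt((va + vp) * (vv + vp)))

    # Posterior probability of the common-cause hypothesis (Bayes' rule).
    post_c1 = like_c1 * p_common / (like_c1 * p_common
                                    + like_c2 * (1 - p_common))

    # Reliability-weighted fusion if the cues are bound, the auditory
    # cue alone if they are not; average the two by their posteriors.
    fused = (x_a / va + x_v / vv) / (1 / va + 1 / vv + 1 / vp)
    audio_only = (x_a / va) / (1 / va + 1 / vp)
    estimate = post_c1 * fused + (1 - post_c1) * audio_only
    return post_c1, estimate
```

When the two cues are close together, the model infers a common cause and fuses them, which is when an illusion like the McGurk effect can arise; when they are grossly mismatched, it infers separate causes and keeps them apart.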

"Understanding how the brain combines information from multiple senses will provide insight into ways to improve declines in speech perception due to typical aging,” he concluded.