How the brain can create sound information via lip-reading

Brain activity synchronizes with the sound waves of speech even when no sound is audible, through lip-reading alone, according to new research published in JNeurosci.

When we listen to speech, our auditory cortex synchronizes with the rhythm of the incoming sound waves. Lip-reading is a useful aid for comprehending unintelligible speech, but how it helps the brain process sound has remained unclear.

The authors used magnetoencephalography to measure brain activity in 28 healthy adults (17 female) while they listened to a story or watched a silent video of a woman speaking. The participants' auditory cortices synchronized with the sound waves produced by the woman in the video, even though they could not hear them.

In the video-only condition, auditory cortical activity entrained to the absent auditory signal at frequencies below 1 Hz more strongly than to the visible lip movements. This entrainment was characterized by an auditory-speech-to-brain delay of ~70 ms in the left hemisphere, compared with ~20 ms in the audio-only condition. Entrainment to mouth opening was found in the right angular gyrus below 1 Hz, and in early visual cortices at 1-8 Hz.
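To make the idea of "entrainment" concrete, the sketch below shows one common way such speech-to-brain coupling can be quantified: coherence between the amplitude envelope of a speech signal and a cortical recording in a low-frequency band. This is an illustrative example with synthetic data, not the authors' analysis pipeline; the sampling rate, 70 ms lag, and all signal names are assumptions.

```python
# Minimal sketch of speech-to-brain entrainment via coherence (illustrative only).
import numpy as np
from scipy.signal import hilbert, coherence

fs = 200.0                          # assumed sampling rate in Hz
t = np.arange(0, 120, 1 / fs)       # two minutes of synthetic data

# Placeholder speech waveform and its amplitude envelope (Hilbert transform).
speech = np.random.randn(t.size)
envelope = np.abs(hilbert(speech))

# Synthetic "brain" signal: the envelope delayed by ~70 ms, plus noise.
meg = np.roll(envelope, int(0.07 * fs)) + np.random.randn(t.size)

# Coherence spectrum between the speech envelope and the brain signal.
f, coh = coherence(envelope, meg, fs=fs, nperseg=int(10 * fs))

# Average coherence below 1 Hz, the band where entrainment was reported.
low = (f > 0) & (f < 1.0)
print(f"Mean coherence below 1 Hz: {coh[low].mean():.3f}")
```

High coherence in the sub-1 Hz band would indicate that the cortical signal tracks the slow rhythm of the speech envelope, which is the kind of coupling the study reports even when the speech itself is silent.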

This synchronization resembled that of participants who actually listened to the story, indicating that the brain can glean auditory information from the visual cues available through lip-reading.

The researchers suggest this ability arises from activity in the visual cortex synchronizing with the lip movements. That signal is then relayed to other brain areas, which translate the movement information into sound information, producing the synchronization with the absent sound waves.

https://www.jneurosci.org/content/early/2019/12/19/JNEUROSCI.1101-19.2019
