Poster C79, Wednesday, August 21, 2019, 10:45 am – 12:30 pm, Restaurant Hall

Lip-reading connected natural speech in normal-hearing individuals: Neural characteristics

Satu Saalasti1,2, Jussi Alho2, Juha Lahnakoski2,3, Mareike Bacha-Trams2, Enrico Glerean2, Iiro Jääskeläinen2, Uri Hasson4, Mikko Sams2; 1University of Helsinki, 2Aalto University School of Science, 3Max Planck Institute of Psychiatry, Munich, 4Princeton University

Seeing a speaker's visual speech gestures (i.e. lip-reading) enhances speech perception, especially in noisy environments. However, only a few individuals are skilled lip-readers, while most struggle with the task. Previous brain imaging studies of lip-reading have found that areas associated with auditory speech perception are activated during silent visual speech. In most of these studies, however, simplified linguistic units were used, which are far from the complexity of natural, continuous speech. In this study, we investigated the neural substrate of lip-reading connected natural speech using functional magnetic resonance imaging. An 8-min narrative was presented in three conditions, (i) lip-reading, (ii) listening, and (iii) reading, to 29 subjects whose lip-reading skill varied extensively. The similarity of individual subjects' brain activity within and between conditions was estimated voxel-wise as the inter-subject correlation (ISC) of BOLD signal time courses. Our results show specific clusters of ISC for lip-reading in the cuneus, lingual gyri, and right cerebellum, beyond initial visual processing. When the subjects listened to the narrative or read it, ISC was found bilaterally in temporo-parietal, frontal, and midline areas, as well as specifically in superior temporal areas during listening and in occipital visual areas during reading. Comparison of ISC between conditions revealed that lip-reading the narrative and listening to it are supported by the same brain areas in temporal, parietal, and frontal cortices, the precuneus, and the cerebellum. Note, however, that lip-reading activated only a small part of the neural network that is active during listening to or reading the narrative. Thus, listening to and reading a natural narrative activate the brain extensively and similarly, whereas the similarity of brain activity during lip-reading vs. reading or listening to the same narrative is much less extensive.
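(The abstract does not specify the exact ISC implementation; a minimal sketch of one common variant, leave-one-out ISC computed per voxel with NumPy, might look like the following. The array shapes, TR, and noise model are illustrative assumptions, not the study's actual data.)

```python
import numpy as np

def isc_leave_one_out(bold):
    """Leave-one-out inter-subject correlation for one voxel.

    bold: array of shape (n_subjects, n_timepoints), one BOLD time
    course per subject. Returns one Pearson r per subject: the
    correlation of that subject's time course with the mean time
    course of all remaining subjects.
    """
    n = bold.shape[0]
    rs = np.empty(n)
    for s in range(n):
        # Mean time course of everyone except subject s
        others = np.delete(bold, s, axis=0).mean(axis=0)
        rs[s] = np.corrcoef(bold[s], others)[0, 1]
    return rs

# Toy example: 29 "subjects" sharing a common stimulus-driven signal
# plus independent noise (240 volumes, e.g. ~8 min at TR = 2 s).
rng = np.random.default_rng(0)
shared = rng.standard_normal(240)
bold = shared + 0.5 * rng.standard_normal((29, 240))
print(isc_leave_one_out(bold).mean())  # high mean ISC: shared signal dominates
```

In a full analysis this computation is repeated for every voxel (within one condition, or between time courses from two conditions), and the resulting ISC maps are thresholded with an appropriate nonparametric test.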
Further, when we analyzed the subjects' lip-reading skill, we found that skilled lip-reading was specifically associated with bilateral activity in the superior and middle temporal cortex, which also encodes auditory speech, suggesting efficient coding of visual speech gestures by the same mechanisms used in auditory coding of phonetic speech features. Our data suggest an extensively shared mechanism for lip-reading and listening to natural, connected speech, consistent with the view that comprehension of narrative speech involves both modality-specific perceptual processing and more general linguistic processing that may be amodal or multimodal.

Themes: Perception: Speech Perception and Audiovisual Integration, Perception: Auditory
Method: Functional Imaging
