

Poster C9, Friday, August 17, 10:30 am – 12:15 pm, Room 2000AB

Do you see what I am saying? An EEG study of the perception of visual speech.

Maëva Michon1, Gonzalo Boncompte1, Vladimir López Hernández1; 1Pontifical Catholic University of Chile, School of Psychology

The current study examines the electrophysiological correlates of the perception of linguistic versus non-linguistic orofacial movements and investigates the role of automatic mimicry in the processing of visual speech. To this end, participants were recorded with a 64-channel EEG device while they attentively observed or imitated short videos displaying four types of orofacial movements (i.e., still mouth, syllables, backward-played syllables, and non-linguistic mouth movements) as well as non-biological movements. To study the role of automatic mimicry, the same experiment was repeated while participants held a tongue depressor horizontally between their teeth. The ERP results showed a significant effect for the perception of visual speech: the amplitude of the N1 component was decreased and that of the P2 component was increased for syllables compared with non-linguistic movements. Interestingly, this effect was no longer significant when imitative behavior was disrupted by the depressor. These findings are in line with behavioral studies reporting an important contribution of visual cues and automatic mimicry to speech perception and comprehension.

Topic Area: Perception: Speech Perception and Audiovisual Integration
