

Poster D67, Friday, August 17, 4:45 – 6:30 pm, Room 2000AB

ERPs in multimodal language comprehension: How discourse information and synchrony influence gesture-speech processing

Isabella Fritz¹,³, Sotaro Kita², Jeannette Littlemore³, Andrea Krott³; ¹Norwegian University of Science and Technology (NTNU), ²University of Warwick, ³University of Birmingham

Previous studies have suggested that, because iconic gestures are inherently ambiguous, gesture-speech synchrony is crucial for their successful semantic integration into a discourse model. In these studies, the presence of an N400 effect when comparing gestures that matched versus mismatched the accompanying speech served as an indicator of integration. However, not all gestures synchronise with their semantic affiliates (i.e., the element(s) in speech related to the gesture’s meaning); some precede them. In an ERP study, we tested whether preceding verbal discourse that constrains a listener’s interpretation of an iconic gesture enables asynchronous gestures to be integrated into the listener’s discourse model. We created two-sentence stimuli in which the introductory sentence contained information that was either semantically related to the gesture’s meaning (“Some of the strawberries in the garden were already ripe.”) or unrelated to it (“At the beginning of the week the weather was dreadful.”). In the target sentence (“I saw that my uncle … picking …”), the gesture was placed on an early content word that could not guide the gesture’s interpretation (“uncle”), whilst the target verb further downstream either semantically matched (“picking”) or mismatched (“watering”) the gesture. For ERPs time-locked to gesture onset, we found a left-anterior negativity with a more negative deflection in the Unrelated Discourse condition (starting at 800 ms). Despite its late onset, which is not uncommon for gesture processing, this effect suggests a more effortful search for a referent in the Unrelated Discourse condition, where the discourse provided no cues for interpreting the gesture. For ERPs time-locked to the target verb, we did not observe an effect of semantic congruency between the gesture and the verb in the N400 time window in either Discourse condition. This might be because the gesture, whose meaning remained vague in both Discourse conditions, did not prime the target verb’s meaning, an explanation in line with results from a behavioural experiment using the same stimulus set. ERPs time-locked to the target verb did, however, show P600-like mismatch effects of semantic congruency between the gesture and the verb in both Discourse conditions, but with different topographical distributions (posterior in the Related Discourse condition; anterior in the Unrelated Discourse condition). We interpreted these P600-like effects as reflecting a reanalysis of the gesture’s meaning triggered after lexical retrieval of the target verb, i.e., when the verb was mapped onto the discourse model. The different topographies might result from different reanalysis processes based on the preceding discourse information: gestures are perceived as incongruous in the Related Discourse condition but as unexpected or a poor fit in the Unrelated Discourse condition (cf. Van Petten & Luka, 2012). The study suggests that synchronisation between a gesture and its semantic affiliate is not essential for gesture-speech unification. However, different integration processes seem to occur when gestures are not in synchrony with their semantic affiliates. Based on our findings and the Retrieval-Integration account of Brouwer et al. (2012), we distinguish three integration processes for asynchronous gestures: search for a referent in the preceding discourse (Nref), context-driven meaning construction/lexical-semantic retrieval (N400), and post-semantic integration into a discourse model (P600).
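The analysis described above follows a standard ERP pipeline: epoch the continuous EEG time-locked to two event types (gesture onset and target-verb onset), average within conditions, and compare mean amplitudes over pre-defined time windows and electrode sites. The sketch below illustrates that kind of pipeline in MNE-Python; it is not the authors' code, and the file name, trigger codes, electrode labels, and time window are all illustrative assumptions.

    # Minimal ERP sketch (illustrative only, not the study's pipeline):
    # epoch EEG around hypothetical trigger codes and compare condition
    # means over an assumed anterior electrode set in a late window.
    import mne

    raw = mne.io.read_raw_fif("subject01_raw.fif", preload=True)  # hypothetical file
    raw.filter(0.1, 30.0)  # typical ERP band-pass

    events = mne.find_events(raw)  # assumes a stimulus trigger channel
    event_id = {
        "gesture/related": 11, "gesture/unrelated": 12,  # gesture-onset triggers (assumed codes)
        "verb/match": 21, "verb/mismatch": 22,           # verb-onset triggers (assumed codes)
    }

    # Time-lock epochs to each trigger, with a -200..0 ms baseline
    epochs = mne.Epochs(raw, events, event_id, tmin=-0.2, tmax=1.2,
                        baseline=(None, 0), preload=True)

    # Condition averages (ERPs) time-locked to gesture onset
    evoked_related = epochs["gesture/related"].average()
    evoked_unrelated = epochs["gesture/unrelated"].average()

    # Mean amplitude over an assumed left-anterior electrode set in a late
    # window (800-1200 ms), mirroring the late gesture-locked negativity
    anterior_left = ["F3", "F7", "FC5"]  # assumed 10-20 montage labels
    for name, evoked in [("related", evoked_related), ("unrelated", evoked_unrelated)]:
        mean_amp = evoked.copy().pick(anterior_left).crop(0.8, 1.2).data.mean()
        print(f"{name} discourse: mean amplitude 800-1200 ms = {mean_amp:.2e} V")

The same epochs object supports the verb-locked contrasts (e.g., epochs["verb/match"] vs. epochs["verb/mismatch"]) over N400 (roughly 300-500 ms) and P600 (roughly 600-900 ms) windows; those boundaries are conventional choices, not values reported in the study.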

Topic Area: Signed Language and Gesture
