
Interactive gestures as response-mobilizing cues? Evidence from corpus, behavioral, and EEG data

Poster C16 in Poster Session C, Friday, October 7, 10:15 am - 12:00 pm EDT, Millennium Hall
This poster is part of the Sandbox Series.

Alexandra Emmendorfer (1,2), Anna Gorter (3), Judith Holler (1,2); (1) Donders Institute for Brain, Cognition, and Behaviour; (2) Max Planck Institute for Psycholinguistics; (3) Radboud University

In face-to-face communication, speech is often accompanied by co-speech gestures. A growing body of experimental research on the processing of multimodal language has focused on representational gestures, which depict actions and objects or point towards fictive referents (e.g., ter Bekke et al., 2020), and on beat gestures, which are visual indicators of emphasis (e.g., Biau et al., 2018). However, non-representational interactive gestures (Bavelas et al., 1992; 1995) remain largely unexplored, especially regarding their potential effect on comprehension. Interactive gestures are thought to aid the coordination of dialogue, for example by referring to the addressee with an addressee-directed point or with an open hand, palm facing up (palm-up open hand, PUOH), or by presenting information to the addressee with an open hand, palm to the side (palm lateral, PL). We aim to explore whether interactive gestures may act as response mobilizers by making the addressee “feel addressed”, thus leading to faster responding/response planning, a vital component of language use in conversational turn-taking (Levinson, 2016). We do so by examining data from a multimodal corpus of Dutch conversations and by developing behavioral and EEG experiments to test our hypotheses.

Corpus analysis of 1692 questions (requests for information) revealed that 8.3% of questions were accompanied by non-representational gestures, with the majority of these gestures falling in the category of palm-up open hand gestures (PUOH, 47.1%). Following prior observations that questions with gestures receive faster responses (Holler et al., 2018; ter Bekke et al., 2020), we examined whether this also holds for non-representational gestures, using a subset of the interactive gesture forms (pointing, n = 23; PUOH, n = 35; PL, n = 9). Questions with these interactive gestures received somewhat faster responses (median turn gap: 364 ms) than questions without gestures (median turn gap: 425 ms), but this difference was not statistically significant. Given the multitude of overlapping visual and auditory signals in the corpus data and the relatively small number of observations for these specific gesture forms, an effect of interactive gestures on response time may be masked by other signals.

We therefore designed two experimental studies (online behavioral, EEG) to investigate whether interactive gestures influence response time in a controlled experimental setting. In both experiments, participants respond to 240 polar (yes/no) questions by button press. Questions are asked by an animated avatar, allowing full control over bodily signals. Half of the questions (120) are accompanied by one of three gestures: pointing (40), PUOH (40), or PL (40); the remaining 120 questions are not accompanied by a gesture and serve as the control condition. We hypothesize faster responses to questions with interactive gestures compared to the no-gesture controls, and explore whether this differs across the gesture conditions. EEG analyses will focus on the lateralized readiness potential (LRP), providing an indicator of how early differences in response preparation arise. Data collection will launch in June, and we expect to present preliminary results at the SNL meeting.
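The abstract does not specify which statistical test was applied to the corpus turn-gap comparison. Purely as an illustration, the sketch below shows how such a comparison could be run with a rank-based test (Mann-Whitney U), a common choice for skewed turn-gap distributions; the simulated values and variable names are hypothetical placeholders, not the authors' corpus data.

```python
# Illustrative sketch of a turn-gap comparison like the one described above.
# The data here are simulated stand-ins; real values would come from the
# corpus annotations of question-response turn gaps (in ms).
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
gaps_gesture = rng.normal(loc=364, scale=200, size=67)       # questions with interactive gestures
gaps_no_gesture = rng.normal(loc=425, scale=200, size=1500)  # questions without gestures

print(f"median (gesture):    {np.median(gaps_gesture):.0f} ms")
print(f"median (no gesture): {np.median(gaps_no_gesture):.0f} ms")

# One-sided test: are gesture-accompanied questions answered faster?
stat, p = mannwhitneyu(gaps_gesture, gaps_no_gesture, alternative="less")
print(f"Mann-Whitney U = {stat:.1f}, p = {p:.3f}")
```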
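The LRP itself is a well-established measure: contralateral-minus-ipsilateral activity over the motor cortices (electrodes C3/C4), averaged across response hands, the standard double-subtraction averaging method. Below is a minimal numpy sketch of that computation; the array shapes, epoch handling, and variable names are illustrative assumptions, not the authors' analysis pipeline.

```python
# Minimal sketch of a lateralized readiness potential (LRP) computation.
# Hypothetical epoched EEG (trials x time) at channels C3 and C4,
# split by which hand gave the button-press response.
import numpy as np

n_trials, n_times = 60, 500
rng = np.random.default_rng(1)
c3_left, c4_left = rng.normal(size=(2, n_trials, n_times))    # left-hand responses
c3_right, c4_right = rng.normal(size=(2, n_trials, n_times))  # right-hand responses

# Contralateral minus ipsilateral, averaged across hands:
# LRP(t) = [ (C4 - C3)_left-hand + (C3 - C4)_right-hand ] / 2
lrp = ((c4_left - c3_left).mean(axis=0) +
       (c3_right - c4_right).mean(axis=0)) / 2

# Comparing LRP onset latencies between gesture and no-gesture questions
# would then index how early response preparation diverges.
print(lrp.shape)  # (n_times,)
```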

Topic Areas: Perception: Speech Perception and Audiovisual Integration; Signed Language and Gesture