
Online Eye-Tracking for Clinical Research: A Validation Study with Aphasia Patients

Poster C56 in Poster Session C, Wednesday, October 25, 10:15 am - 12:00 pm CEST, Espace Vieux-Port
This poster is part of the Sandbox Series.

Willem van Boxtel¹, Rylee Manning¹, Michael Linge¹, Lily Haven¹, Jiyeon Lee¹; ¹Purdue University

Background: Eye-tracking is a useful tool for investigating the neural bases of human language processing in healthy and impaired populations. However, clinical eye-tracking studies typically involve small samples, which limits the generalizability of their results. A reliable online paradigm would widen the accessibility of clinical research. Slim and Hartsuiker (2022) and Wisiecka et al. (2022) reported promising results for moving eye-tracking online via webcam-based recording in healthy adults, despite recording delays of up to 300 ms online. However, these studies did not compare web-based and in-lab eye-tracking within participants, and no study has yet established the feasibility of web-based eye-tracking for investigating language comprehension in adults with aphasia. Using a within-subject design, we investigate (1) whether online eye-tracking yields spatio-temporal accuracy comparable to lab-based eye-tracking in a sentence-picture matching task with persons with aphasia (PWA) and controls; and (2) what methodological issues arise from online eye-tracking with PWA.

Data collection is ongoing. Online and in-person auditory picture-matching tasks are presented to PWA and age-matched controls in counterbalanced order. Auditory questions are presented while two pictures (a target and a foil) are shown on-screen. For the in-person task, an EyeLink 1000 Plus tracker is used; for online eye-tracking, the WebGazer.js algorithm running in Gorilla.sc tracks the eyes through participants' webcams. Participants are hypothesized to look more at the target picture after the auditory prompt disambiguates the choice between the two pictures. However, PWA may show slower, less pronounced looks to the target, and given previous results (Slim & Hartsuiker, 2022), disambiguation effects may be delayed in web compared to lab data by as much as 300 ms.
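The disambiguation measure described above — the proportion of looks to the target picture over the course of a trial — can be sketched as a simple time-binning analysis. The sketch below is illustrative only: the sample format (timestamp plus a "target"/"foil"/None region label) and the function name are assumptions, not the study's actual pipeline.

```python
# Illustrative sketch (assumed data format, not the study's pipeline):
# compute the proportion of gaze samples on the target picture in
# fixed-width time bins. Each sample is (timestamp_ms, region), where
# region is "target", "foil", or None (off-picture / track loss).

def target_proportions(samples, bin_ms=500, total_ms=10_000):
    """Proportion of on-picture samples falling on the target, per bin."""
    n_bins = total_ms // bin_ms
    on_target = [0] * n_bins
    on_picture = [0] * n_bins
    for t, region in samples:
        if region is None or not (0 <= t < total_ms):
            continue  # skip track loss and out-of-window samples
        b = int(t // bin_ms)
        on_picture[b] += 1
        if region == "target":
            on_target[b] += 1
    # None marks bins with no usable samples (e.g. prolonged track loss)
    return [tg / n if n else None for tg, n in zip(on_target, on_picture)]

# Toy stream in which looks shift toward the target over time:
samples = [(0, "foil"), (100, "foil"), (200, "target"),
           (600, "target"), (700, "target"), (800, "foil")]
props = target_proportions(samples, bin_ms=500, total_ms=1000)
print(props)  # first bin: 1/3 on target; second bin: 2/3
```

A curve of such per-bin proportions rising above chance (50%) after the disambiguation point is what both the lab and web analyses would look for.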
Preliminary data from a case study of one PWA and one control suggest comparable results between modalities. Both lab and web eye-tracking are sensitive to differences between healthy and impaired participants. Temporal resolution is greater in the lab than on the web (500 Hz vs. 60 Hz); nevertheless, both modalities show participants looking increasingly at the target picture as sentence comprehension unfolds, indicating that sentence disambiguation effects are evident in both systems. The control showed faster and stronger disambiguation effects, with looks to the target averaging >80% around 5 seconds post-trial onset in both the lab and web experiments; the PWA reached >60% looks to the target around 8 seconds post-onset in both modalities. These preliminary results suggest that both eye-tracking modalities are comparably sensitive to between-participant differences, despite the greater temporal and spatial resolution of lab-based eye-tracking. If the pooled results do not reveal a cross-modal temporal delay, web-based eye-tracking could expand the breadth of research shifting to online measures, leading to greater reliability and generalizability of studies involving PWA. If both modalities continue to yield comparable results, virtual eye-tracking holds promise for larger sample sizes and a wider reach for aphasia research, with important implications for the field at large.
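Comparing recordings with such different sampling rates (500 Hz lab vs. 60 Hz web) requires putting both on a common time grid before proportions can be compared bin by bin. One standard approach is to carry the most recent sample forward to each grid point; the sketch below illustrates that idea under assumed names and data shapes, and is not the study's actual analysis.

```python
# Illustrative sketch (assumption, not the study's method): align gaze
# streams of different sampling rates onto a common time grid by taking,
# for each grid point, the most recent sample at or before it.
import bisect

def resample_last(timestamps_ms, values, grid_ms):
    """For each grid time, return the last sample at or before it (None if none)."""
    out = []
    for g in grid_ms:
        i = bisect.bisect_right(timestamps_ms, g) - 1
        out.append(values[i] if i >= 0 else None)
    return out

# A 500 Hz stream has a sample every 2 ms; a 60 Hz stream roughly every
# 16.7 ms. Downsampling the lab stream to a ~60 Hz grid makes the two
# modalities directly comparable.
lab_t = list(range(0, 100, 2))                        # 2 ms spacing (500 Hz)
lab_v = ["target" if t >= 50 else "foil" for t in lab_t]
grid = list(range(0, 100, 17))                        # ~60 Hz comparison grid
print(resample_last(lab_t, lab_v, grid))
```

Carrying the last sample forward (rather than interpolating) keeps the categorical target/foil labels intact, which is why it is a natural choice for region-of-interest data.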

Topic Areas: Disorders: Acquired, Methods
