
Beyond the brain: A multimodal approach to describing the relationship between brain oscillations, heart rate, and involuntary eye movements during auditory processing


Poster E112 in Poster Session E, Thursday, October 26, 10:15 am - 12:00 pm CEST, Espace Vieux-Port
This poster is part of the Sandbox Series.

Wenfu Bao1, Alejandro Pérez2, Monika Molnar1; 1University of Toronto, 2University of Surrey

Introduction. Most theories describe higher-level cognition based on observations of the human brain, while the role of the autonomic nervous system (ANS) is often overlooked (Porges & Furman, 2011). Here, we jointly recorded the electroencephalogram (EEG), heart rate, and involuntary eye movements while participants performed linguistic and nonlinguistic auditory tasks that varied in cognitive load. Specifically, we compared neurophysiological differences in (a) alpha-band brain oscillatory power, (b) mean heart rate and interbeat interval, and (c) microsaccade rate between linguistic (familiar vs. unfamiliar languages) and nonlinguistic (simple vs. complex musical sounds) processing, and explored the associations among these attentional measures. This multimodal approach will shed light on how the neural and peripheral nervous systems interact to ultimately modulate behavior related to auditory processing.

Methods. 70 English-speaking young adults (age range: 18–25; 35 female) completed two active listening tasks, one linguistic and one nonlinguistic, with 18 trials (around 15 minutes) each. During the linguistic task, participants listened to short passages spoken in a familiar (English; low cognitive load) or unfamiliar (Hebrew; high cognitive load) language while watching a video on a display, and were asked to indicate by button press whether they heard a target word. During the nonlinguistic task, participants listened to sequences of musical tones that were either simple (fewer instrument varieties; low cognitive load) or complex (more instrument varieties; high cognitive load) while watching the video, and were asked to indicate whether they heard a target instrument sound. For data acquisition, 32-channel EEG and heart rate were recorded with an actiCHamp Plus amplifier (Brain Products GmbH). Eye movements were collected with an EyeLink 1000 Plus eye tracker (SR Research, Canada). Oscillatory power in the alpha frequency band was estimated using the BrainVision Analyzer software, mean heart rate and interbeat interval were calculated using the EEG-Beats toolbox (Thanapaisal et al., 2020), and microsaccades were detected using the Microsaccade Toolbox for R (Engbert et al., 2015).

Outcome Interpretation and Significance. In this study, we observe neural and physiological responses in tandem to more fully understand the activity of the human nervous system during auditory processing. We anticipate identifying distinct patterns linked to linguistic and nonlinguistic processing as a function of cognitive load. The multimodal approach we employed can advance theories of auditory processing that include the involvement of the ANS. Moreover, the possible associations among the EEG, heart rate, and eye movement data can broaden the range of techniques available for studying auditory cognition.
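To make the outcome measures concrete, the following is a minimal Python sketch of two of them: alpha-band (8–13 Hz) power from a single EEG channel via Welch's method, and mean interbeat interval and heart rate from a series of R-peak times. The synthetic signal, sampling rate, and R-peak times are hypothetical illustrations; the actual analyses used BrainVision Analyzer and the EEG-Beats toolbox as described above.

```python
import numpy as np
from scipy.signal import welch

# --- Alpha-band (8-13 Hz) power from one EEG channel ---
fs = 500                                   # Hz; hypothetical sampling rate
rng = np.random.default_rng(0)
eeg = rng.standard_normal(fs * 60)         # 60 s of synthetic "EEG"

freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)   # 2 s windows, 0.5 Hz resolution
band = (freqs >= 8) & (freqs <= 13)
df = freqs[1] - freqs[0]
alpha_power = psd[band].sum() * df         # integrate PSD over the alpha band

# --- Mean heart rate and interbeat interval from R-peak times ---
r_peaks_s = np.array([0.0, 0.82, 1.66, 2.47, 3.31])  # hypothetical R-peak times (s)
ibi = np.diff(r_peaks_s)                   # interbeat intervals (s)
mean_ibi_ms = ibi.mean() * 1000
mean_hr_bpm = 60.0 / ibi.mean()            # beats per minute
```

The same logic extends per trial, so alpha power, interbeat interval, and heart rate can be compared across the low- and high-cognitive-load conditions.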

Topic Areas: Control, Selection, and Executive Processes, Methods
