Using RIFT to study the role of lower frequency oscillations in sensory processing and audiovisual integration

Poster B50 in Poster Session B and Reception, Thursday, October 6, 6:30 - 8:30 pm EDT, Millennium Hall

Noor Seijdel1, Jan-Mathijs Schoffelen2, Peter Hagoort1,2, Linda Drijvers1; 1Max Planck Institute for Psycholinguistics, 2Donders Institute for Brain, Cognition and Behaviour

During communication in real-life settings, our brain needs to integrate auditory information (such as speech) with visual information (visible speech, co-speech gestures) in order to form a unified percept. In addition, in order to efficiently understand another person, we need to select the relevant sources of information while preventing interference from irrelevant events (other people talking, meaningless movements). In the current study, we used rapid invisible frequency tagging (RIFT) and magnetoencephalography (MEG) to investigate whether the integration and interaction of audiovisual information might be supported by low-frequency phase synchronization between regions. We presented participants with videos of an actress uttering action verbs (auditory; tagged at 58 Hz) accompanied by iconic gestures. To manipulate spatial attention, we included an attentional cue and presented the visual information with different tagging frequencies left and right of fixation (visual; attended stimulus tagged at 65 Hz; unattended stimulus tagged at 63 Hz). Integration difficulty was manipulated by lower-order auditory factors (clear/degraded speech) and higher-order visual factors (congruent/incongruent gesture). Results indicated that gestures hindered comprehension when the actress performed an incongruent gesture and speech was degraded. In the MEG, we identified spectral peaks at the individual tagging frequencies (58, 63, and 65 Hz). Beamformer source analyses indicated that coherence with a dummy modulation signal was strongest at occipital regions for the visually tagged signals (63 and 65 Hz), with stronger coherence for the attended frequency (65 Hz) when speech was clear and stronger coherence for the unattended frequency (63 Hz) when speech was degraded. For the auditory-tagged signal (58 Hz), coherence was strongest at temporal regions, specifically when speech was degraded. To evaluate whether the integration of audiovisual information is supported by low-frequency phase synchronization, we used these subject-specific sensory regions to assess source-level connectivity between four virtual channels (left visual, right visual, left auditory, right auditory). Moreover, we observed an intermodulation frequency (7 Hz) resulting from the interaction between the attended visual frequency-tagged signal (65 Hz) and the auditory frequency-tagged signal (58 Hz). By linking lower-frequency phase to the ease or success of lower-order audiovisual integration, this study aimed to characterize the dynamic routing of information during audiovisual integration and the information flow between relevant brain areas. Combining rapid invisible frequency tagging with multiple sources of audiovisual information enabled us to investigate the brain's response to multiple competing stimuli and their integration, laying the groundwork for future studies using more natural communication paradigms.
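To illustrate the tagging analysis in principle, the sketch below simulates how coherence between a recorded signal and a dummy modulation signal can isolate a frequency-tagged response. This is a minimal sensor-level sketch using scipy, not the authors' source-level beamformer pipeline; the sampling rate, recording length, and signal-to-noise ratio are illustrative assumptions.

```python
import numpy as np
from scipy.signal import coherence

fs = 1000.0                      # sampling rate in Hz (assumed)
t = np.arange(0, 60.0, 1 / fs)   # 60 s of simulated recording (assumed)
f_tag = 65.0                     # attended visual tagging frequency

rng = np.random.default_rng(0)
dummy = np.sin(2 * np.pi * f_tag * t)                # dummy modulation signal
signal = 0.1 * dummy + rng.standard_normal(t.size)   # weak tagged response in noise

# Magnitude-squared coherence between the dummy signal and the recording;
# a peak at the tagging frequency indicates a reliable tagged response.
f, cxy = coherence(dummy, signal, fs=fs, nperseg=4096)
print(f"coherence near {f_tag:.0f} Hz: {cxy[np.argmin(np.abs(f - f_tag))]:.2f}")
```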
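The reported 7 Hz component also follows from simple trigonometry: a nonlinear interaction between the 65 Hz and 58 Hz tagged signals produces intermodulation components at the difference (65 − 58 = 7 Hz) and sum (65 + 58 = 123 Hz) frequencies. The sketch below demonstrates this with a multiplicative nonlinearity as a stand-in for whatever neural interaction occurs; all parameters are illustrative assumptions, not the study's analysis settings.

```python
import numpy as np
from scipy.signal import welch

fs = 1000.0                              # sampling rate in Hz (assumed)
t = np.arange(0, 60.0, 1 / fs)
visual = np.sin(2 * np.pi * 65.0 * t)    # attended visual tagging signal
auditory = np.sin(2 * np.pi * 58.0 * t)  # auditory tagging signal

# A multiplicative nonlinearity: sin(a)sin(b) = 0.5[cos(a-b) - cos(a+b)],
# so the product carries energy at 7 Hz and 123 Hz, not at 65 or 58 Hz.
interaction = visual * auditory
f, pxx = welch(interaction, fs=fs, nperseg=1000)  # 1 Hz frequency resolution
for f0 in (58.0, 65.0, 7.0, 123.0):
    print(f"power at {f0:5.1f} Hz: {pxx[np.argmin(np.abs(f - f0))]:.4f}")
```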

Topic Areas: Perception: Speech Perception and Audiovisual Integration, Multisensory or Sensorimotor Integration