
Poster C67, Wednesday, August 21, 2019, 10:45 am – 12:30 pm, Restaurant Hall

Motor engagement relates to accurate perception of phonemes and audiovisual words, but inaccurate perception of auditory words.

Kelly Michaelis1, Makoto Miyakoshi2, Andrei V. Medvedev1, Peter E. Turkeltaub1,3; 1Georgetown University Medical Center, 2Swartz Center for Computational Neuroscience, University of California San Diego, 3Medstar National Rehabilitation Network

Prior studies have demonstrated motor activity during speech perception, but have not systematically examined the conditions under which the motor system is engaged, including perception of whole words and meaningful non-speech sounds. We examined an EEG signature of motor activity (sensorimotor μ/beta suppression) to test the hypothesis that motor regions are engaged when ventral stream processing mechanisms are insufficient to identify a word (e.g., during isolated syllable perception or noisy conditions) or when additional information, such as seeing the speaker, obligatorily engages motor speech systems. In contrast, we hypothesized that during unambiguous word-level perception, processing should occur solely in the ventral stream. In 24 healthy, right-handed adults (mean age 23.6 years; 16 female), we measured the EEG signal during perception of auditory single words (AudWord), auditory CVC phonemes (Phoneme), audiovisual single words (AVWord), and auditory environmental sounds (EnvSound). The task was an adaptive four-alternative forced-choice task that manipulated signal-to-noise ratios to achieve two levels of difficulty for each stimulus type (Easy, 80% correct; Hard, 50% correct). EEG was recorded using a 128-channel Geodesic net. Using EEGLAB, data were preprocessed and subjected to independent components analysis, dipole fitting, and clustering. Component clusters included four a priori areas of interest: left IFG, left sensorimotor cortex, right sensorimotor cortex, and left auditory cortex. We performed time-frequency decomposition and measured stimulus-related μ/beta power (8-30 Hz). A series of 2x2 mixed-effects models examined effects of condition (AudWord vs. each of the other conditions) by accuracy (Correct vs. Incorrect), and condition by difficulty (Easy vs. Hard, including only correct trials). Sensorimotor μ/beta suppression was left-lateralized. Within the left sensorimotor cluster, AVWord and Phoneme stimuli showed enhanced μ/beta suppression for correct relative to incorrect trials, while AudWord stimuli showed the opposite pattern: enhanced suppression for incorrect trials and synchronization for correct trials (AVWord vs. AudWord, condition*accuracy, p<.001; Phoneme vs. AudWord, condition*accuracy, p=.001). As expected, there was little modulation of μ/beta power by EnvSound (AudWord vs. EnvSound, condition*accuracy, p=.042). For correct responses in the AVWord and Phoneme conditions, μ/beta suppression was observed for both Easy and Hard trials, whereas in the AudWord condition there was no suppression for Easy trials and increased power for Hard trials (AVWord vs. AudWord, main effect of condition, p=.003; Phoneme vs. AudWord, condition*difficulty, p=.012). In both the left IFG and auditory clusters, μ/beta suppression was present in all conditions but was greatest for the AudWord Easy trials (IFG: AVWord vs. AudWord, condition*difficulty, p=.007; Phoneme vs. AudWord, condition*difficulty, p=.012; Auditory cortex: AVWord vs. AudWord, main effect of difficulty, p=.012; Phoneme vs. AudWord, condition*difficulty, p=.009). Our results suggest that motor involvement in perception is left-lateralized and specific to speech. Furthermore, motor engagement relates to correct perception of sublexical and audiovisual words but incorrect perception of auditory-only words. These findings support a model in which the motor system is flexibly engaged to aid perception depending on the nature of the speech stimuli, and suggest that processing auditory-only words via this mechanism is ineffective. The results also suggest that the left IFG and auditory cortex preferentially process clear lexical items.
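To make the μ/beta measure concrete, the sketch below shows one way to compute stimulus-related μ/beta (8-30 Hz) power suppression from epoched EEG. It uses MNE-Python rather than the EEGLAB/ICA-cluster pipeline actually used in the study, and the file name and channel picks are hypothetical placeholders; it is a minimal illustration of the band-power suppression idea, not the authors' analysis.

```python
import numpy as np
import mne
from mne.time_frequency import tfr_morlet

# Load preprocessed, epoched EEG. The file name is a hypothetical placeholder;
# the study itself used a 128-channel Geodesic net preprocessed in EEGLAB.
epochs = mne.read_epochs("sub01_perception-epo.fif")

# Morlet-wavelet time-frequency decomposition over the mu/beta range (8-30 Hz).
freqs = np.arange(8.0, 31.0, 1.0)
n_cycles = freqs / 2.0  # wavelet length scales with frequency
power = tfr_morlet(epochs, freqs=freqs, n_cycles=n_cycles,
                   return_itc=False, average=True, decim=2)

# Express power as a log-ratio against a pre-stimulus baseline, so negative
# values correspond to mu/beta suppression (event-related desynchronization).
power.apply_baseline(baseline=(-0.5, 0.0), mode="logratio")

# Collapse across the band and across illustrative sensorimotor channels
# (channel names are placeholders, not the study's component clusters).
picks = mne.pick_channels(power.info["ch_names"], include=["C3", "CP3"])
mu_beta_timecourse = power.data[picks].mean(axis=(0, 1))
print(mu_beta_timecourse.shape)  # one value per (decimated) time sample
```

Band-power values obtained this way per trial and condition could then feed a condition-by-accuracy or condition-by-difficulty mixed-effects analysis of the kind described in the abstract; that modeling step is omitted here.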

Themes: Speech Perception, Perception: Speech Perception and Audiovisual Integration
Method: Electrophysiology (MEG/EEG/ECOG)
