
Decoding auditory feedback during speech production with fMRI


Poster E67 in Poster Session E, Thursday, October 26, 10:15 am - 12:00 pm CEST, Espace Vieux-Port
This poster is part of the Sandbox Series.

Abigail Bradshaw1, Clare Press2,3, Clément Gaultier1, Matt Davis1; 1University of Cambridge, 2Birkbeck, University of London, 3University College London

Dominant models of speech motor control assume the primacy of a prediction error coding scheme for representing speech sensory feedback, i.e., one in which top-down predictions are subtracted from bottom-up sensory feedback, with neural activity representing the remaining unexpected features. However, an alternative coding scheme for prediction exists in the perceptual processing literature and has recently been applied to motor control in non-speech domains (Yon et al., 2018). These ‘sharpening’ accounts propose that predictions are instead used to enhance the representation of expected features in sensory input. Within the field of speech motor control, observations of suppressed univariate activity when speaking with veridical versus altered auditory feedback have been interpreted as evidence for prediction error coding. However, a sharpened representation would also likely manifest as an overall reduction in the summed magnitude of activation within an area, because a more precise representation entails a weaker representation of noise. Speaking-induced suppression of univariate activity is therefore predicted by both accounts, making this type of evidence unsuitable for discriminating between them. In speech perception, a unique neural signature of prediction error coding has been demonstrated using multivariate decoding methods that examine the content of auditory cortex representations (Blank & Davis, 2016; Sohoglu & Davis, 2020). In the current planned experiment, we aim to test for the presence of the same neural signature during speech production (i.e., processing of self-produced speech auditory feedback). Specifically, this requires crossing the usual manipulation of speech expectedness with a manipulation of speech clarity. If auditory cortex activity encodes prediction error, the effect of clarity on decoding success will depend on the level of expectedness: at low expectedness, increasing clarity will result in a greater mismatch with predictions, yielding a more informative prediction error and thus better decoding; conversely, at high expectedness, increasing clarity will result in a more precise match with predictions, yielding greater cancellation and thus poorer decoding. Alternatively, if auditory cortex uses a sharpening coding scheme, expectedness and clarity should have additive effects, such that decoding of activity patterns is always most successful for stimuli that are clearer, regardless of how expected they are. In our planned work, we will first investigate the effects of expectedness and clarity during speech production on participants' explicit reports of (1) the clarity of their speech feedback and (2) detection of changes to their speech feedback. For these perceptual measures, we predict additive effects of clarity and expectedness, as seen for perception of speech produced by others. Expectedness will be manipulated using real-time perturbation of formants, as in altered auditory feedback experiments. Clarity will be manipulated using real-time noise vocoding, a technique that degrades the level of spectral detail in the speech signal. We will then use this paradigm in an fMRI study to investigate the effects of clarity and expectedness on decoding of auditory feedback representations within auditory cortex during speech production. Additive effects would provide evidence for a sharpening coding scheme, whereas an interaction would be uniquely predicted by a prediction error account.
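
To make the contrast between the two predicted patterns concrete, the short Python sketch below simulates hypothetical pattern informativeness when expectedness is crossed with clarity. It is not the planned analysis; the numerical levels and the linear forms are illustrative assumptions only. Under prediction error coding the simulated effect of clarity reverses with expectedness (a crossover interaction), whereas under sharpening the two factors combine additively.

# Toy sketch, not the authors' analysis: hypothetical informativeness of
# auditory cortex patterns under the two coding schemes. All values illustrative.
import numpy as np

clarity = np.array([0.25, 0.5, 0.75, 1.0])   # low -> high speech clarity (illustrative scale)
expectedness = np.array([0.0, 1.0])          # 0 = unexpected (perturbed) feedback, 1 = expected feedback

# Prediction error coding: the error signal is informative when clear input
# mismatches the prediction, but is cancelled when clear input matches it.
pred_error = np.array([[c * (1 - e) + (1 - c) * e for c in clarity] for e in expectedness])

# Sharpening: expectedness and clarity each enhance the representation, so
# their effects on decodability are additive (clearer is always better).
sharpening = np.array([[0.5 * c + 0.5 * e for c in clarity] for e in expectedness])

print("Prediction error (rows: unexpected, expected; columns: increasing clarity):")
print(pred_error)
print("Sharpening (rows: unexpected, expected; columns: increasing clarity):")
print(sharpening)

In the planned fMRI analysis, decoding accuracy for auditory feedback representations would play the role of this informativeness measure: a crossover pattern like the first matrix would support prediction error coding, whereas the additive pattern of the second would support sharpening.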

Topic Areas: Speech Motor Control, Speech Perception
