
Using frequency selectivity to examine category-informative dimension-selective attention


Poster C85 in Poster Session C, Wednesday, October 25, 10:15 am - 12:00 pm CEST, Espace Vieux-Port

Sahil Luthra1, Raha N. Razin2, Chisom O. Obasih1, Adam T. Tierney3, Frederic Dick2, Lori L. Holt1; 1Carnegie Mellon University, 2University College London, 3Birkbeck, University of London

The ability to assign complex stimuli to behaviorally relevant categories is a hallmark of cognition. In the auditory domain, listeners must leverage experience to learn how to carve up the acoustic soundscape, both in speech perception (e.g., to differentiate among phonetic categories) and for non-speech stimuli (e.g., to differentiate the cries of predator versus prey). Theoretical accounts have posited that successful auditory categorization may rely on selective attention to diagnostic auditory dimensions (e.g., Francis & Nusbaum, 2002), with the possibility that listeners may additionally suppress non-diagnostic dimensions. The current study (target N=50) leverages fMRI to investigate whether selective attention underlies auditory categorization; specifically, we examine cortical activation when categorization depends on diagnostic information conveyed in particular frequency bands. Prior to scanning, adult listeners complete a five-day training regimen in which they learn to categorize four novel nonspeech auditory categories defined in a complex multidimensional space (Obasih et al., 2023). Each stimulus consists of three consecutive bandpass-filtered hums in a high band (1-3 kHz) and three simultaneous bandpass-filtered hums in a low band (0-500 Hz), where the hums are nonspeech pitch contours derived from multi-talker productions of Mandarin words varying in lexical tone. Critically, the four stimulus categories differ in which frequency band is diagnostic of category identity: in the category-diagnostic frequency band, all three hums are drawn from a single tone category, whereas in the non-diagnostic frequency band, the three hums are drawn from different tone categories. For two categories, the high band is diagnostic; for the other two, the low band is diagnostic. As such, categorization depends on recognizing characteristic (but acoustically variable) hum patterns within the category-diagnostic frequency band.
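As a hedged illustration of the two-band stimulus construction described above, the sketch below sums a low band (0-500 Hz) and a high band (1-3 kHz) of filtered hum waveforms. The sample rate, filter type, filter order, and all function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 44100  # assumed sample rate (Hz); illustrative, not from the study


def bandpass(signal, low_hz, high_hz, fs=FS, order=4):
    """Zero-phase Butterworth bandpass filter (sketch of the 1-3 kHz band)."""
    sos = butter(order, [low_hz, high_hz], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, signal)


def lowpass(signal, cutoff_hz, fs=FS, order=4):
    """Zero-phase Butterworth lowpass filter (sketch of the 0-500 Hz band)."""
    sos = butter(order, cutoff_hz, btype="low", fs=fs, output="sos")
    return sosfiltfilt(sos, signal)


def make_stimulus(low_hums, high_hums):
    """Sum a sequence of three low-band hums with three simultaneous
    high-band hums, as in the stimulus description. Within the
    category-diagnostic band, all three hums would come from one tone
    category; in the other band, from different tone categories."""
    low = lowpass(np.concatenate(low_hums), 500)
    high = bandpass(np.concatenate(high_hums), 1000, 3000)
    n = min(len(low), len(high))
    return low[:n] + high[:n]
```

In this sketch the "hums" would be pitch-contour waveforms; here any 1-D arrays of equal sample rate will do.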
Here, we test the hypothesis that successful categorization requires directing attention to the category-diagnostic spectral band, potentially coupled with attentional suppression of the non-diagnostic band. We do so by comparing the amplitude of activation evoked during auditory categorization within different tonotopically mapped regions (Dick et al., 2012), as well as within “attention-o-tonotopic” maps driven by explicitly cued attention to high or low spectral bands (e.g., “listen high”; Dick et al., 2017). We hypothesize that successful categorization will be linked to enhanced recruitment of cortical regions that prefer the diagnostic frequency band; baseline activation is indexed by a control task in which participants categorize stimuli based on stimulus amplitude. Furthermore, we hypothesize that auditory selective attention involves suppression, indexed as reduced recruitment of cortical regions that prefer the non-diagnostic frequency band (relative to activity during the control task). Preliminary results indicate that, among listeners who learn the auditory categories to criterion, there is concordance between cortical activation in the auditory categorization task, stimulus-driven tonotopic maps, and tonotopic maps driven by explicit attention. These results suggest that auditory categorization may drive selective attention to category-diagnostic dimensions, highlighting a possible mechanism through which auditory experience may guide the formation of perceptual categories.
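The enhancement and suppression predictions above can be expressed as a simple ROI contrast: mean activation during categorization minus mean activation during the amplitude-judgment control task, computed separately in regions preferring the diagnostic versus the non-diagnostic band. The function and variable names below, and the simple subtraction contrast, are illustrative assumptions rather than the study's analysis pipeline.

```python
import numpy as np


def band_contrast(betas_categorize, betas_control, roi_mask):
    """Mean (categorization - control) activation within one ROI.

    betas_categorize, betas_control: 1-D arrays of voxelwise activation
    estimates (e.g., GLM betas) for the two tasks.
    roi_mask: boolean array marking voxels in a tonotopically (or
    attention-o-tonotopically) defined region of interest.
    """
    diff = betas_categorize[roi_mask] - betas_control[roi_mask]
    return float(diff.mean())

# The hypotheses then predict, for a given trained listener:
#   band_contrast(..., diagnostic_roi)     > 0   (enhancement)
#   band_contrast(..., nondiagnostic_roi)  < 0   (suppression)
```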

Topic Areas: Speech Perception
