
Poster A67, Tuesday, August 20, 2019, 10:15 am – 12:00 pm, Restaurant Hall

Auditory Language Processing in the Visual Cortex of Blind Individuals

Kiera O'Neil¹, Aaron Newman¹; ¹Dalhousie University

Introduction: Early and sustained lack of visual input leads to an adaptive re-shaping of brain organization (Kupers & Ptito, 2014). For example, during speech (auditory) language processing, people who are blind show activation of the visual cortex, specifically the primary visual cortex (V1), an area not typically activated in sighted people. This has been observed for tasks involving verb generation and verbal memory (Amedi, Raz, Pianka, Malach, & Zohary, 2003), semantic and phonological processing (Burton, Diamond, & McDermott, 2003), and sentence comprehension (Bedny, Pascual-Leone, Dodell-Feder, Fedorenko, & Saxe, 2011). It is not clear what functional role or roles V1 plays in language processing in people who are blind. The purpose of this study was to clarify the role of the visual cortex in auditory language processing in early blind adults by comparing activation across tasks involving different types of language processing (semantic/phonological) and across tasks that vary in their verbal working memory demands.

Methods: We have collected fMRI data from two groups of participants: early blind adults (n = 6, mean age = 44.1) and sighted controls (n = 10, mean age = 44.6). Participants made decisions about the relationship between auditorily presented pairs of words in three categories: semantic (meaning), phonological (rhyme), and perceptual control (speaker). In addition, two n-back conditions were presented, in which participants indicated whether a word in a list matched the immediately preceding word (1-back) or the word two before it (2-back).

Results: Both blind participants and sighted controls showed activity in similar left frontal and temporal language regions for the semantic (middle frontal gyrus, Broca's area, inferior temporal gyrus, temporal fusiform gyrus), phonological (middle and inferior frontal gyrus, precentral gyrus, fusiform gyrus), and perceptual conditions (middle frontal gyrus, Broca's area, inferior temporal gyrus, fusiform gyrus). Similar activity in both groups was also observed for the 1-back (inferior temporal gyrus, superior frontal gyrus, supramarginal gyrus, superior parietal lobe) and 2-back conditions (frontal pole, superior frontal gyrus, supramarginal gyrus, angular gyrus, superior temporal gyrus). However, only the blind group reliably recruited the visual cortex (both primary and extrastriate regions), and only for the semantic, phonological, and n-back conditions, not for the perceptual control condition. No significant difference in visual cortex activity was found between the semantic and phonological conditions. To determine whether the visual cortex is primarily involved in verbal working memory, we compared activity in the 1-back condition to the 2-back condition, which increases the verbal working memory demands. For the blind participants, no difference in visual cortex activity between the 1-back and 2-back conditions was observed.

Conclusion: Our results to date indicate that the visual cortex is involved in both semantic and phonological processing in people who are blind, but not in auditory tasks that are perceptual rather than linguistic in nature. Additionally, while previous work has suggested that the visual cortex may be primarily involved in verbal working memory in those who are blind, our results provide no evidence that verbal working memory load modulates visual cortex activity.
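
For readers unfamiliar with the n-back manipulation described in the Methods, the following minimal Python sketch (hypothetical, not the authors' stimulus or analysis code) illustrates how target trials are defined: the 1-back and 2-back conditions differ only in how far back in the word list a match is sought, which is what raises the verbal working memory demand in the 2-back condition.

    # Hypothetical illustration of the n-back target definition used in the
    # Methods; not the authors' actual experiment code.

    def nback_targets(words, n):
        """Return a boolean list marking positions where the word matches
        the word presented n items earlier (a 'target' trial)."""
        return [i >= n and words[i] == words[i - n] for i in range(len(words))]

    if __name__ == "__main__":
        word_list = ["cat", "dog", "dog", "sun", "dog", "sun"]
        # 1-back: only position 2 repeats its immediate predecessor ("dog").
        print(nback_targets(word_list, 1))  # [False, False, True, False, False, False]
        # 2-back: positions 4 and 5 match the word two items earlier.
        print(nback_targets(word_list, 2))  # [False, False, False, False, True, True]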

Themes: Speech Perception, Perception: Speech Perception and Audiovisual Integration
Method: Functional Imaging
