Poster Slam Session C
Friday, August 17, 10:15 – 10:30 am, Room 2000C, Chair: Mairéad MacSweeney

The neurobiology of Braille reading beyond the VWFA

Judy Sein Kim¹, Erin Brush¹, Shipra Kanjlia¹, Marina Bedny¹; ¹Johns Hopkins University

According to the cultural recycling hypothesis, the neural basis of culturally acquired skills is scaffolded on evolutionarily older neural mechanisms (Dehaene & Cohen, 2007). Reading, for instance, is believed to “recycle” elements of the visual object recognition and language systems. What aspects of reading determine which older systems get co-opted? To gain insight into this question, we examined the neurobiology of reading by touch in proficient blind readers of Braille. Previous studies suggested that, like sighted readers of print, blind Braille readers develop a “visual word form area” (VWFA) in the ventral object-processing stream (Reich et al., 2011). Contrary to this hypothesis, we recently showed that in congenitally blind individuals, the anatomical location of the “VWFA” responds to the grammatical complexity of spoken sentences (Kim et al., 2017). These results raise several questions. What are the neural systems that support Braille reading? Are there any specialized neural responses to Braille orthography? If so, are these responses localized in the somatosensory, visual, or fronto-temporal language network? To look for candidate Braille-specific regions, we examined data from a group of blind individuals (9F/1M) who took part in three experiments (Braille word reading, auditory sentence processing, and Braille letter priming). In Experiment 1, participants read Braille words, consonant strings, and tactile shapes, and listened to auditory words and words played backwards (from Kim et al., 2017). In Experiment 2, participants listened to sentences with a syntactic difficulty manipulation as well as to nonword lists (from Lane et al., 2015). In Experiment 3, participants were presented with strings of single letters and strings of shapes, which had either low variance (the same letter/shape repeated) or high variance (all different letters/shapes). Braille word reading relative to listening to backward speech (Experiment 1) activated a wide set of regions in the left hemisphere, including the ventral occipito-temporal, early visual, somatosensory, parietal, and frontal cortices (FDR corrected, p<0.05). A smaller subset of the same regions was found in the right hemisphere. The letter strings > shape strings contrast (Experiment 3) activated the angular gyrus, which has previously been implicated in orthographic working memory (Rapp et al., 2016). Nearly all Braille word-responsive regions, both in fronto-temporal cortices and in “visual” cortices, displayed sensitivity to high-level language: Braille words > Braille consonant strings > tactile shapes, auditory words > backward sounds, and grammatically complex > grammatically simple sentences. Interestingly, the left somatosensory cortex displayed a different profile: (Braille words = consonant strings) > tactile shape strings, auditory words = backward sounds, and auditory sentences = nonword lists (t-tests, significant at p<0.05). The right somatosensory cortex showed only a preference for tactile over auditory stimuli. These findings suggest that there are important differences in the neural bases of reading in sighted versus blind readers. We hypothesize that both the modality of word recognition (visual vs. tactile) and prior visual experience (congenital blindness vs. growing up with sight) shape the neurobiology of reading. This research provides insights into how experience interacts with innate predispositions to shape the neural basis of cultural systems.
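For readers unfamiliar with how whole-brain contrasts of the kind reported above are typically thresholded, the sketch below illustrates, in Python with scipy and statsmodels, a generic group-level voxelwise one-sample t-test with Benjamini–Hochberg FDR correction at p < 0.05. The data, array shapes, and variable names are illustrative assumptions only and do not represent the authors' actual analysis pipeline.

    # Illustrative sketch: group-level voxelwise contrast with FDR correction,
    # analogous in spirit to a "Braille words > backward speech" comparison.
    # All values here are simulated; shapes and names are assumptions.
    import numpy as np
    from scipy import stats
    from statsmodels.stats.multitest import multipletests

    n_subjects, n_voxels = 10, 50000          # e.g., 10 participants, 50k voxels
    rng = np.random.default_rng(0)

    # Per-subject contrast maps: beta(condition A) - beta(condition B), one value per voxel
    contrast_maps = rng.normal(size=(n_subjects, n_voxels))

    # One-sample t-test across subjects at every voxel (is the contrast > 0?)
    t_vals, p_vals = stats.ttest_1samp(contrast_maps, popmean=0.0, axis=0)

    # Benjamini-Hochberg FDR correction at q < 0.05
    reject, p_fdr, _, _ = multipletests(p_vals, alpha=0.05, method="fdr_bh")
    print(f"{reject.sum()} voxels survive FDR correction (q < 0.05)")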

Topic Area: Perception: Orthographic and Other Visual Processes

Poster C14
