
Beyond central vision: Unique peripheral word and face processing abilities in deaf signers


Poster D116 in Poster Session D, Wednesday, October 25, 4:45 - 6:30 pm CEST, Espace Vieux-Port

Zed Sevcikova Sehyr1, Sofia E. Ortega1, Katherine J. Midgley1, Phillip J. Holcomb1; 1San Diego State University

Words are processed most efficiently in central vision, although readers also use information in the parafovea to preprocess upcoming words during reading. Congenitally deaf people proficient in sign language have consistently shown larger reading spans than hearing readers (Belanger et al., 2012), reflecting efficient allocation of attention to, and integration of, information presented simultaneously across the visual field during signing. This study examined readers' ability to process words in the periphery using an ERP repetition priming paradigm. Primes were presented in central vision and were fixated, followed by target stimuli in the periphery (left or right). Eccentric targets were not fixated, and participants who consistently made saccades to the targets were excluded. Participants made same-different judgments by pressing a response button while EEG was recorded from 29 scalp electrodes. Response hand was counterbalanced across participants. We additionally included face and car stimuli in separate blocks to assess whether enhanced visual processing capacity in deaf readers is specific to words or generalizes to non-orthographic objects. We focused on priming effects on the N400 component, which we predicted would be sensitive to any lexico-semantic processing of peripherally located targets. The study included 38 adults: 19 deaf American Sign Language signers and 19 hearing English speakers. Results revealed semantic priming effects for words, as indexed by the N400 component, with unrelated word pairs generating more negative-going ERPs between 300 and 450 ms than repeated pairs. However, this effect was more pronounced and more consistent across participants in the deaf group, suggesting an increased ability to extract and integrate meaning from spatially and temporally distinct word stimuli. Second, unrelated face pairs also elicited a more negative-going deflection between 300 and 450 ms than repeated face pairs in both groups, indicating similar peripheral processing benefits for faces. Notably, the face priming effect lasted longer in the deaf group, perhaps due to signers' linguistic experience with facial expressions. Further, both groups showed earlier priming effects for words and faces when eccentric stimuli were presented to the right and left visual fields, respectively, supporting the expected left hemisphere dominance for words and right hemisphere dominance for faces. Finally, deaf and hearing participants exhibited a similar time course and distribution of N400 repetition effects for cars. Accuracy and reaction times, collected for all stimulus types, were consistent with these patterns. Overall, deaf readers showed enhanced, and perhaps more resilient, lexico-semantic processing of words in the periphery across the two visual hemifields, whereas hearing readers may do so less consistently. Although these results are promising, more work is needed to identify the exact mechanisms involved. The study furthers our understanding of the neurobiological basis of language and visual processing in readers with atypical sensory-perceptual experiences.

Topic Areas: Reading, Signed Language and Gesture
