
How can the processing of sign language help the processing of spoken language? Evidence from priming

Poster C105 in Poster Session C, Wednesday, October 25, 10:15 am - 12:00 pm CEST, Espace Vieux-Port

Liangliang Li1, Yingying Tan2, Wenshuo Chang3, Xiaolin Zhou4; 1Shanghai International Studies University

Previous studies have shown that when bilinguals read words in their second language, the corresponding words in their first language are automatically activated. However, whether this activation occurs in the absence of shared phonological or orthographic features between the languages has remained unclear. Unlike two spoken languages, sign language and spoken language do not share an articulatory system, and the phonological and visual forms of sign language differ markedly from those of spoken language. The current study investigated how processing the first language (sign language) could help the processing of the second language when deaf participants read the second language (Chinese), and how their second-language proficiency would modulate this potential facilitatory effect. Deaf participants were presented with prime-target word pairs, with the prime word in sign language form and the target word in written Chinese (characters). The sign primes were 1) phonologically similar to the sign equivalent of the target word (sharing two or three phonological parameters: handshape, movement, location), 2) semantically related to the target, or 3) unrelated to the target. Participants made speeded lexical decisions to the target words. Results showed that target words were responded to faster and more accurately when they were preceded by semantically or phonologically related signs than by unrelated control primes. Moreover, the phonological priming effect was negatively correlated with participants' proficiency in Chinese, with less proficient participants showing larger priming effects. This pattern of results demonstrates that when deaf people see sign words, other sign words sharing phonological features are automatically activated. This activation can spread to the lexical representations of the corresponding second-language words, aiding the processing of the latter. Less proficient deaf readers of the second language would benefit more because their slower processing of the second language gives them more time to receive this spreading activation.

Keywords: sign language, written language, lexical representation, deaf reader
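The reported proficiency modulation can be illustrated with a minimal analysis sketch. The abstract does not specify the statistical method, so the following assumes per-participant mean reaction times per prime condition and a Pearson correlation between the priming effect and a proficiency score; all values and variable names are hypothetical, for illustration only.

import numpy as np
from scipy import stats

# Hypothetical per-participant mean reaction times (ms) in two prime
# conditions, plus a Chinese proficiency score. These numbers are
# illustrative and are not the study's data.
rt_unrelated    = np.array([820, 790, 760, 740, 700, 680])
rt_phonological = np.array([760, 745, 730, 720, 695, 678])
proficiency     = np.array([40, 48, 55, 62, 70, 78])

# Phonological priming effect: how much faster targets are responded to
# after phonologically related sign primes than after unrelated primes.
priming_effect = rt_unrelated - rt_phonological

# The abstract reports a negative correlation: less proficient readers
# show larger priming effects.
r, p = stats.pearsonr(proficiency, priming_effect)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")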

Topic Areas: Signed Language and Gesture, Phonology
