Slide Slam I7
Self-paced reading and ERP studies on Chinese sentence processing by prelingually Deaf learners
Qi Cheng1, Xu Yan1, Lujia Yang2, Hao Lin3; 1University of Washington, 2Shanghai Jiao Tong University, 3Shanghai International Studies University
Deaf individuals often have limited access to auditory language input, resulting in incomplete acquisition of the spoken/written language. In particular, Deaf readers tend to be less sensitive to syntactic violations than to semantic violations (Skotara et al. 2012; Mehravari et al. 2017). Previous studies suggest that language learners with limited proficiency, including deaf signers with impoverished early language (Cheng & Mayberry 2020), often over-rely on semantic cues when a sentence contains conflicting cues. Studying how prelingually Deaf learners resolve conflicting cues in real time can shed light on the nature of the language learning/processing difficulties in this population and also reveal the role of fully accessible early language in brain development. The current study examines how prelingually Deaf adults in China process written Chinese sentences with canonical and reversed semantic roles. We have two research questions: 1) Are prelingually Deaf learners more likely to rely on event plausibility when comprehending sentences with reversed semantic roles? 2) If so, what real-time processing mechanisms contribute to their comprehension performance? We first gathered behavioral data using a self-paced reading task (key pressing with a moving window, implemented in Ibex), with a sentence plausibility question (‘Does this sentence make sense?’) following each sentence. We used a 2 × 2 factorial design crossing animacy order (Animate-Inanimate, AI vs. Inanimate-Animate, IA) and plausibility (plausible vs. implausible), using the SOV BA-construction and the OSV BEI-construction in Mandarin, with 15 trials per condition and an equal number of fillers. Compared to hearing controls (N=42, mean age=25.0), prelingually Deaf participants who did not use hearing aids before age 3 (N=39, mean age=31.21) showed lower accuracy, especially when the sentences were implausible (t=-2.470*).
Consistent with previous findings, this suggests that Deaf readers rely more on event plausibility even when it conflicts with the syntactic cue. However, their reading times were longer at the key verb region when the sentence was implausible (t=4.525***), with an additional spillover effect on the following word (t=1.973*), even when their responses were incorrect. In contrast, hearing participants showed no such slowdown in any condition. Rather than missing the syntactic cues entirely, Deaf readers showed increased processing difficulty, which may contribute to their lower accuracy. These findings suggest that prelingually Deaf learners are sensitive to syntactic cues but struggle to resolve conflicting cues in real time. To further examine these processing difficulties, we are currently conducting an ERP experiment using SOV BA-construction sentences that are plausible, semantically reversed, or contain an incompatible verb. We aim to recruit 20 Deaf and 20 hearing participants. Given the processing difficulties observed in the self-paced reading task, there are two possibilities for the semantically reversed sentences. If the difficulties arise from reanalyzing the sentence, we would expect a larger P600 effect in the Deaf group than in the hearing group. Alternatively, if the difficulties arise from lexical surprisal at the verb, we should expect a larger N400 effect.