

Poster B60, Thursday, August 16, 3:05 – 4:50 pm, Room 2000AB

Sensorimotor EEG indicates deaf signers simulate tactile properties of ASL signs when reading English

Lauren Berger¹, Lorna C. Quandt¹; ¹Gallaudet University

When a deaf bilingual reads a word in English, the corresponding ASL sign is also activated lexically (Shook & Marian, 2012), in a process called cross-modal, cross-linguistic translation. Emerging evidence suggests that when deaf signers read English, they recruit the brain’s sensorimotor system to automatically simulate the movements required to produce the corresponding ASL signs (Quandt & Kubicek, 2017). We expand upon that work to ask whether somatosensory (e.g., tactile and proprioceptive) properties of ASL signs, in addition to motor properties, are simulated when deaf signers read. Here, we examined neural activity during reading of words whose ASL translations involve Contact between the hand and body (e.g., POLICE; C words) and words whose ASL translations involve No Contact (e.g., BAG; NC words). We hypothesized that there would be greater activation in the somatosensory regions of the sensorimotor system when deaf signers read C words than NC words, and that this effect would not be present for hearing non-signers. We collected EEG data from 23 hearing non-signers and 24 deaf fluent signers while they passively read English words whose ASL translations are either C or NC. For each EEG electrode within our central region of interest (ROI), we performed time-frequency analyses across the alpha and beta frequency bands from word onset (0 ms) to 1000 ms. Additionally, we conducted full-scalp analyses across all 64 electrodes, yielding a map of alpha and beta power over the analysis epoch. In the deaf group, we found greater lower-beta desynchronization (14-17 Hz) from 250-500 ms following the onset of NC words compared to C words. The significant effects were observed at four adjacent electrodes overlying a left centro-parietal region. Full-scalp analysis revealed no significant difference between NC and C words for hearing participants, indicating no effect of simulating ASL signs in this group. Our ROI analysis at central electrodes overlying somatosensory cortex revealed greater lower-beta and upper-alpha desynchronization in response to reading NC words than C words. Further analysis revealed that this effect was present in two clusters: five electrodes in the right fronto-central region and four electrodes in the left centro-temporo-parietal region. Together, these results demonstrate significant differences in somatosensory EEG activity when deaf signers read English words whose ASL translations have different tactile and proprioceptive properties. Following the idea that deaf ASL users simulate ASL signs when reading English words, our finding of a post-movement beta rebound (PMBR) in response to reading NC words indicates that this simulation also includes the somatosensory features of ASL signs. PMBR is an oscillatory phenomenon that occurs after the offset of a movement, characterized by a sharp increase in beta power that is maximal over sensorimotor cortex (Parkes, Bastiaansen, & Norris, 2006) and indicative of the involvement of somatosensory processing areas. We demonstrate that Deaf readers simulate the tactile and proprioceptive properties of the ASL signs corresponding to the English words they read, and that this simulation evokes a greater post-movement beta rebound for no-contact signs than for contact signs.
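For readers curious about the kind of analysis described above, the following is a minimal, hedged sketch of a per-condition time-frequency contrast in MNE-Python. It is not the authors' actual pipeline: the file name, event labels, electrode ROI, and baseline window are illustrative assumptions, and only the general ingredients (Morlet-wavelet power in the alpha/beta range, a central-electrode ROI, and an NC-versus-C contrast in the lower-beta band from 250-500 ms) follow the abstract.

```python
# Hedged sketch of a condition-wise time-frequency analysis (NOT the authors' code).
# Assumptions: preprocessed epochs saved to "sub01_reading-epo.fif", event labels
# "contact" / "no_contact", and a hypothetical central ROI over sensorimotor cortex.
import numpy as np
import mne
from mne.time_frequency import tfr_morlet

epochs = mne.read_epochs("sub01_reading-epo.fif")  # assumed filename
epochs_c = epochs["contact"]        # assumed event label for C words
epochs_nc = epochs["no_contact"]    # assumed event label for NC words

# Alpha and beta range (8-30 Hz); the lower-beta band of interest (14-17 Hz) lies inside it.
freqs = np.arange(8.0, 31.0, 1.0)
n_cycles = freqs / 2.0  # common heuristic: frequency-dependent wavelet length

# Average Morlet-wavelet power per condition.
power_c = tfr_morlet(epochs_c, freqs=freqs, n_cycles=n_cycles,
                     return_itc=False, average=True)
power_nc = tfr_morlet(epochs_nc, freqs=freqs, n_cycles=n_cycles,
                      return_itc=False, average=True)

# Express power as percent change from an assumed pre-stimulus baseline,
# so that negative values correspond to desynchronization.
for power in (power_c, power_nc):
    power.apply_baseline(baseline=(-0.3, 0.0), mode="percent")

# Hypothetical central ROI (electrode labels are assumptions, not the authors' ROI).
roi = ["C3", "C1", "Cz", "C2", "C4", "CP3", "CP1", "CPz", "CP2", "CP4"]
picks = mne.pick_channels(power_c.info["ch_names"], include=roi)

def band_window_mean(power, fmin, fmax, tmin, tmax, picks):
    """Mean power in a frequency band and time window over the ROI electrodes."""
    f_mask = (power.freqs >= fmin) & (power.freqs <= fmax)
    t_mask = (power.times >= tmin) & (power.times <= tmax)
    return power.data[picks][:, f_mask][:, :, t_mask].mean()

# Contrast NC vs. C in the lower-beta band (14-17 Hz), 250-500 ms after word onset.
diff = (band_window_mean(power_nc, 14, 17, 0.25, 0.50, picks)
        - band_window_mean(power_c, 14, 17, 0.25, 0.50, picks))
print(f"NC minus C lower-beta power (ROI, 250-500 ms): {diff:.3f} % change")
```

A full analysis would of course add the group comparison (deaf signers vs. hearing non-signers), the whole-scalp maps across all 64 electrodes, and appropriate statistics; the sketch only illustrates the single-subject, single-contrast step.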

Topic Area: Signed Language and Gesture
