Poster D74, Thursday, November 9, 6:15 – 7:30 pm, Harborview and Loch Raven Ballrooms

Manual directional gestures facilitate learning of Mandarin tones

Anna Zhen1,2, Stephen Van Hedger1, Shannon Heald1, Susan Goldin-Meadow1, Xing Tian2;1The University of Chicago, 2New York University Shanghai

Action and perception interact in complex ways to shape cognition. For example, gestures can affect learning and verbal communication, especially among non-native speakers. However, the processes through which motor and visual gestural information influence auditory learning remain unclear. We hypothesize that this cross-modal learning benefit arises from a common representation of certain features, such as direction, across the motor, visual, and auditory domains. To test this hypothesis, the present study examined the role of performed and visually displayed directional pitch gestures in helping native English speakers learn the tones of Chinese vowels and words. Two types of hand gestures were included: iconic pitch gestures that mimic the directional dynamics of the four Chinese lexical tones, and rotated pitch gestures generated by rotating the iconic gestures 90 degrees. Moreover, we parametrically manipulated the modalities (motor, visual, and auditory) engaged during learning by assigning participants to conditions in which they performed pitch gestures, watched videos of pitch gestures, or received no gesture information. Sixty-five participants received one of five types of training, defined by crossing gesture type with training modality: 1) performing iconic pitch gestures (three modalities: visual, auditory, and motor), 2) performing rotated pitch gestures (three modalities: visual, auditory, and motor), 3) watching iconic pitch gestures (two modalities: visual and auditory), 4) watching rotated pitch gestures (two modalities: visual and auditory), or 5) no gestural (motor or visual) information (one modality: auditory). Learning was quantified as the improvement in tone identification performance either immediately or one day after training. Tones recorded by different speakers were also used to test whether learning generalized.
We found that participants who performed pitch gestures during learning, regardless of whether the gestures were iconic or rotated, outperformed the other groups in identifying the tones in Chinese vowels. Moreover, these learning effects generalized to novel, untrained monosyllabic Chinese words. In contrast, participants who learned by watching rotated gestures were no better at identifying the tones in Chinese vowels and words than participants who received auditory training only; accuracy in these two groups was the lowest of the five training groups. These results support the idea that the motor and sensory systems may share a common coordinate system for representing direction, and that reinforcement across modalities helps individuals direct attention to the acoustic features relevant for learning.

Topic Area: Speech Motor Control and Sensorimotor Integration
