
Poster D70, Friday, August 17, 4:45 – 6:30 pm, Room 2000AB

Flexible meaning: the neuromodulation of noun meaning by a prior verb

Bingjiang Lyu1, Alex Clarke1, Hun Choi1, Lorraine Tyler1;1Department of Psychology, University of Cambridge

During the incremental interpretation of spoken language, each word that is heard plays a dual role – it is integrated into the ongoing sentential context and places constraints on upcoming words. In the present study we focus on how verbs constrain the semantics of upcoming words by investigating the neural processes that underpin the generation of a verb’s semantic constraints (its selectional preferences) and how they modulate the semantics of an upcoming direct object noun (DO). To do this, participants listened to a set of 360 sentences of the form “subject noun phrase + verb + DO noun”, comprising triplets of verbs and nouns, enabling us to vary the verb’s semantic constraints on the DO noun. Within each triplet we varied the specificity of the verb’s semantic constraints. For example, in “The boy tasted the mushroom” the verb constrains towards an edible DO, whereas in “The boy liked the mushroom” the verb places less specific constraints on the DO. We modelled the semantics of each verb and DO noun using a state-of-the-art corpus-based topic modelling procedure based on Latent Dirichlet Allocation (LDA). This describes each verb’s semantic vector as a probability distribution over semantic topics, reflecting the semantic constraints on the verb’s direct object slot via topics. Differences in these distributions across verbs are then tested against the brain data using representational similarity analysis (RSA). A model of the semantics of the DO nouns was calculated in a similar way. Using these methods we constructed model RDMs (representational dissimilarity matrices) of the semantic representation of each verb and noun in the sentence, together with entropy and surprisal models of verb semantics, and tested them against the EMEG data. The RDM of the representation of a verb’s semantic constraints (Vsem) was constructed by calculating the cosine distance between verb topic vectors.
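The RDM construction described above can be sketched as follows. This is a minimal illustrative example, not the authors' code: the three-topic vectors and the verb labels in the comments are hypothetical, chosen only to show how pairwise cosine distances over LDA topic distributions yield a dissimilarity matrix.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform

def semantic_rdm(topic_vectors):
    """Pairwise cosine distances between word topic vectors.

    topic_vectors: (n_words, n_topics) array; each row is a probability
    distribution over LDA topics. Returns an (n_words, n_words)
    representational dissimilarity matrix (zeros on the diagonal).
    """
    return squareform(pdist(topic_vectors, metric="cosine"))

# Hypothetical topic vectors for three verbs (values for illustration only)
verbs = np.array([
    [0.8, 0.1, 0.1],   # verb with specific constraints, e.g. "tasted"
    [0.4, 0.3, 0.3],   # verb with less specific constraints, e.g. "liked"
    [0.1, 0.1, 0.8],   # verb constraining towards a different topic
])
rdm = semantic_rdm(verbs)
```

In an RSA pipeline, such a model RDM would then be correlated with data RDMs computed from the neural responses within each searchlight and time window.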
The specificity of each verb’s semantic constraints was captured by the entropy of its topic vector and was used to construct verb constraint strength RDMs (VH). RDMs of noun semantics (Nsem) were calculated in the same way. A verb-weighted noun semantic vector was obtained through element-by-element multiplication of the normalized verb topic vector and the corresponding noun vector, and was used to construct the verb-weighted noun semantics (VWNsem) RDM. These model RDMs were tested against data RDMs from both verb and noun epochs. Spatiotemporal searchlight RSA (ssRSA) was conducted using 10 mm-radius searchlights within bilateral language masks, with a 60 ms sliding time window. The results were corrected for multiple comparisons using a cluster permutation test (1000 permutations, vertex-wise p < 0.05, cluster-wise p < 0.05). The results show significant and early effects of the verb’s semantic representations, starting before the verb’s uniqueness point (UP) as the target verb emerges from its word-initial cohort, and persisting throughout the noun in bilateral MTG/LAG. Around 100 ms later, in LMTG, we see effects of the strength/specificity of the verb’s constraints, which also persist throughout the noun. Finally, we found strong effects of the verb’s modulation of the DO noun’s semantics, starting from the onset of the noun until its isolation point (IP) and involving LMTG, showing the early modulation of a word’s meaning by the prior context.
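The two quantities introduced above – the entropy of a verb’s topic vector and the verb-weighted noun vector – can be sketched in a few lines. This is an assumption-laden illustration, not the authors' implementation: the topic vectors are invented, and the renormalisation of the weighted vector back to a probability distribution is our guess at the normalisation step.

```python
import numpy as np

def topic_entropy(p, eps=1e-12):
    """Shannon entropy (bits) of a topic distribution.

    Higher entropy = a flatter distribution = less specific semantic
    constraints (e.g. "liked" vs the more constraining "tasted").
    eps guards against log(0) for zero-probability topics.
    """
    p = np.asarray(p, dtype=float)
    return float(-np.sum(p * np.log2(p + eps)))

def verb_weighted_noun(verb_vec, noun_vec):
    """Element-by-element product of verb and noun topic vectors,
    renormalised to sum to 1 (hypothetical normalisation choice)."""
    w = np.asarray(verb_vec, dtype=float) * np.asarray(noun_vec, dtype=float)
    return w / w.sum()

# Hypothetical three-topic vectors for illustration only
specific_verb = np.array([0.8, 0.1, 0.1])   # peaked: strong constraints
vague_verb = np.array([1/3, 1/3, 1/3])      # flat: weak constraints
noun = np.array([0.5, 0.3, 0.2])

h_specific = topic_entropy(specific_verb)   # lower entropy
h_vague = topic_entropy(vague_verb)         # higher entropy
weighted = verb_weighted_noun(specific_verb, noun)
```

Cosine distances between such weighted vectors would then populate the VWNsem model RDM, just as the raw verb vectors populate Vsem.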

Topic Area: Computational Approaches