Slide Slam E10

Graph and not vector-embedding models: Computational mechanisms for neural representation of words

Slide Slam Session E, Tuesday, October 5, 2021, 5:30 - 7:30 pm PDT

Ze Fu1,2, Xiaosha Wang1,2, Xiaoying Wang1,2, Huichao Yang1,2, Jiahuan Wang1,2, Tao Wei1,2, Xuhong Liao3, Zhiyuan Liu4, Yanchao Bi1,2,5; 1State Key Laboratory of Cognitive Neuroscience and Learning & IDG, McGovern Institute for Brain Research, Beijing Normal University, Beijing, 2Beijing Key Laboratory of Brain Imaging and Connectomics, Beijing Normal University, Beijing, 3School of Systems Science, Beijing Normal University, Beijing, 4Department of Computer Science and Technology, Tsinghua University, Beijing, 5Chinese Institute for Brain Research, Beijing

A critical way for humans to acquire knowledge is through language, yet the computational mechanisms through which language contributes to word meaning are poorly understood. We compared three major types of computational mechanism that derive word-relational structure from a large language corpus (simple co-occurrence, graph-space relations, and vector-space relations) in terms of how well each accounts for word-elicited brain activity patterns, measured in two functional magnetic resonance imaging (fMRI) experiments. Word relations derived from the graph-space representation, but not from the other two types, had unique explanatory power for neural activity patterns in brain regions known to be particularly sensitive to language processing, including the anterior temporal lobe (capturing graph-common-neighbor structure) and the inferior frontal gyrus and posterior middle/inferior temporal gyrus (capturing graph-shortest-path structure). These results were robust across different co-occurrence window sizes and graph sizes, and they were relatively specific to language input: they were not associated with stimulus structures derived by the same computations from visual co-occurrence statistics. These findings highlight the role of cumulative language input in shaping word meaning representations in this set of brain regions and provide mathematical models of how these regions capture different types of language-derived information.
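
To make the three mechanism types concrete, here is a minimal Python sketch (not the authors' implementation) of how each word-relation measure can be derived from a corpus: raw co-occurrence counts within a window, cosine similarity between vectors standing in for a vector-embedding model (a crude SVD of the co-occurrence matrix is used here for illustration), and the two graph measures named in the abstract, common neighbors and shortest path. The toy corpus, window size, and embedding dimensionality are arbitrary illustrative choices.

```python
# A minimal sketch of the three word-relation measures contrasted in the
# abstract. Corpus, window size, and embedding dimensionality are
# illustrative assumptions, not the study's actual settings.
from collections import Counter

import networkx as nx
import numpy as np

corpus = [
    "the dog chased the cat",
    "the cat chased the mouse",
    "the dog and the cat are pets",
]
window = 2  # co-occurrence window size (a parameter the study varied)

# 1) Simple co-occurrence: count how often two words appear within the window.
vocab = sorted({w for sent in corpus for w in sent.split()})
idx = {w: i for i, w in enumerate(vocab)}
cooc = Counter()
for sent in corpus:
    words = sent.split()
    for i, w in enumerate(words):
        for j in range(i + 1, min(i + 1 + window, len(words))):
            a, b = sorted((w, words[j]))
            if a != b:
                cooc[(a, b)] += 1

# 2) Vector-space relations: embed words and relate them by cosine
#    similarity (an SVD of the co-occurrence matrix stands in for a
#    trained embedding model here).
M = np.zeros((len(vocab), len(vocab)))
for (a, b), c in cooc.items():
    M[idx[a], idx[b]] = M[idx[b], idx[a]] = c
U, S, _ = np.linalg.svd(M)
emb = U[:, :3] * S[:3]  # 3-dimensional embeddings for the toy corpus

def cosine(a, b):
    va, vb = emb[idx[a]], emb[idx[b]]
    return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb) + 1e-9))

# 3) Graph-space relations: words are nodes, co-occurrences are edges;
#    read off the two graph measures named in the abstract.
G = nx.Graph()
G.add_edges_from(cooc)

def common_neighbors(a, b):
    # graph-common-neighbor: number of direct neighbors shared by two words
    return len(list(nx.common_neighbors(G, a, b)))

def shortest_path(a, b):
    # graph-shortest-path: minimal number of edges linking two words
    return nx.shortest_path_length(G, a, b)

print(cooc[("cat", "dog")], cosine("cat", "dog"),
      common_neighbors("cat", "dog"), shortest_path("cat", "dog"))
```

Note that in this toy corpus "cat" and "dog" never co-occur within the window (their raw count is 0), yet the graph measures still relate them through shared neighbors and a two-edge path, which illustrates the kind of relational structure a pure co-occurrence count misses.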
