Poster Slam Session B, Tuesday, August 20, 2019, 3:00 – 3:15 pm, Finlandia Hall, Angela Grant

Beyond Phrase Structure: An Alternative Analysis of Brennan and Hale (2019) Using a Dependency Parser

Phillip M. Alday¹, Ingmar Brilmayer²; ¹Max Planck Institute for Psycholinguistics, ²University of Cologne

Recently, Brennan & Hale (2019) published an analysis of EEG data recorded while participants listened passively to an audio recording of Alice in Wonderland. They found that part-of-speech predictions based on a hierarchical phrase structure model correlate more strongly with human EEG activity than those of sequential models of syntactic structure in everyday language (cf. Brennan et al., 2016; Hale et al., 2018). In particular, B&H compared the model fit of a sequential trigram model, a simple three-level recurrent neural network (SRN; Frank et al., 2015), and a context-free grammar (CFG) that explicitly assumes hierarchical syntactic structure, and found that the CFG model performs best in modeling the EEG data. Here, we present an alternative analysis of the data of Brennan & Hale (2019) using a dependency grammar for hierarchical structure. While phrase structure grammars use structural categories and phrases to model dependencies between words, dependency models use words and functional relationships (e.g. subject or object). One advantage of dependency models lies in their ability to model cross-linguistic variability in word category membership: a word is interpreted as a verb when it takes "subject" and "object" parameters, while it is interpreted as a noun when it serves as a parameter (e.g. object or subject) to a verb. Moreover, transition-based or shift-reduce dependency parsers provide a convenient model of incremental sentence processing, with every shift operation corresponding to an increase in memory load, and every reduce operation to the construction of a single complex entity (comparable to "unification" or "composition" in some accounts) with a corresponding decrease in memory load and an overall different feature profile (cf. Lewis, Vasishth, and Van Dyke, 2006). In addition to online measures of whether each new token results in a shift or a reduce, there are also offline measures, such as the number of arcs connecting a particular token to preceding elements (nleft) or succeeding elements (nright) and the overall depth of a token in the final parse tree. EEG data were preprocessed similarly to Brennan & Hale; likewise, regression timecourses were computed samplewise at each electrode, with covariates for the word frequency of the current, previous, and next word; the position of a word within the current sentence; and the position of the sentence within the overall story. In contrast to B&H, we performed all modelling hierarchically, i.e. by using mixed-effects models on the combined data from all subjects at each electrode, and entered all predictors simultaneously. We found effects for nleft, which we interpret as reflecting the dynamics of memory load: the positive effect of nleft prestimulus corresponds to increasing memory load over time, while the large negativity peak between 400 and 600 ms corresponds to the processing associated with changing representations in memory. We also found a weaker effect for nright at roughly 400 ms, which may reflect predictive processes. Thus, our analysis demonstrates the feasibility of dependency grammars as models of syntactic structure building during the comprehension of naturalistic, everyday language stimuli, without the strong (and empirically difficult to test) assumptions of phrase structure grammar.
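The shift/reduce dynamics described above can be made concrete with an arc-standard transition oracle. The following is a minimal sketch, not the authors' parser: given a gold dependency parse (encoded as 1-based head indices, 0 for the root), it replays the shift operations (memory-load increases) and reduce operations (memory-load decreases) needed to derive that parse. Function names and the toy sentence are illustrative only.

```python
def oracle_transitions(heads):
    """Replay an arc-standard shift-reduce derivation of the gold parse.

    heads[i] is the 1-based index of token (i+1)'s head, 0 for the root.
    Returns the transition sequence as (operation, token) pairs.
    """
    n = len(heads)
    # remaining[h]: dependents of token h not yet attached
    remaining = [0] * (n + 1)
    for h in heads:
        remaining[h] += 1
    stack, buffer, ops = [], list(range(1, n + 1)), []
    while buffer or len(stack) > 1:
        if len(stack) >= 2:
            s2, s1 = stack[-2], stack[-1]
            # reduce via left-arc: s1 heads s2, and s2 is complete
            if heads[s2 - 1] == s1 and remaining[s2] == 0:
                stack.pop(-2)
                remaining[s1] -= 1
                ops.append(("REDUCE-LEFT", s2))
                continue
            # reduce via right-arc: s2 heads s1, and s1 is complete
            if heads[s1 - 1] == s2 and remaining[s1] == 0:
                stack.pop()
                remaining[s2] -= 1
                ops.append(("REDUCE-RIGHT", s1))
                continue
        if not buffer:
            break  # non-projective tree: no valid transition remains
        stack.append(buffer.pop(0))
        ops.append(("SHIFT", stack[-1]))
    return ops

# Toy parse of "Alice chased the white rabbit":
# Alice -> chased, chased -> ROOT, the -> rabbit, white -> rabbit, rabbit -> chased
print(oracle_transitions([2, 0, 5, 5, 2]))
```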
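The offline measures can likewise be read directly off the final parse. Here is a minimal sketch under the same toy head-index representation, following the definitions in the abstract (nleft/nright count arcs linking a token to preceding/succeeding elements; depth is the token's distance from the root); the exact operationalisation in the actual analysis may differ.

```python
def dependency_measures(heads):
    """Per-token offline measures from 1-based head indices (0 = root)."""
    n = len(heads)
    nleft, nright = [0] * n, [0] * n
    for dep, head in enumerate(heads, start=1):
        if head == 0:
            continue
        lo, hi = min(dep, head), max(dep, head)
        nright[lo - 1] += 1  # the arc connects lo to a succeeding token
        nleft[hi - 1] += 1   # the arc connects hi to a preceding token

    def depth(i):  # number of head-links from token i up to the root
        d = 0
        while heads[i - 1] != 0:
            i, d = heads[i - 1], d + 1
        return d

    return nleft, nright, [depth(i) for i in range(1, n + 1)]

# Same toy parse of "Alice chased the white rabbit" as above:
nleft, nright, depth = dependency_measures([2, 0, 5, 5, 2])
# nleft  == [0, 1, 0, 0, 3]
# nright == [1, 1, 1, 1, 0]
# depth  == [1, 0, 2, 2, 1]
```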
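For the group-level statistics, a samplewise mixed-effects regression can be sketched as follows. This is not the authors' pipeline: it assumes long-format data with hypothetical column names (amplitude, subject, freq_cur, word_pos, sent_pos, ...) and uses statsmodels for illustration; one such model would be fit per electrode and time sample on the combined data from all subjects, with all predictors entered simultaneously.

```python
import pandas as pd
import statsmodels.formula.api as smf

def fit_sample(df: pd.DataFrame):
    """Fit one mixed-effects model for a single electrode/time sample.

    df: one row per (subject, word), holding the EEG amplitude at this
    sample plus the lexical, positional, and dependency covariates.
    """
    model = smf.mixedlm(
        "amplitude ~ freq_cur + freq_prev + freq_next"
        " + word_pos + sent_pos + nleft + nright + depth",
        data=df,
        groups=df["subject"],  # by-subject random intercepts
    )
    return model.fit()

# Looping fit_sample over samples and electrodes yields the
# regression timecourses described in the abstract.
```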

Themes: Computational Approaches, Syntax
Method: Electrophysiology (MEG/EEG/ECOG)

Poster B68