Slide Slam


Slide Slam H13

Infants Show Increased Neural Tracking of Intonation during Natural Infant-Directed Speech

Slide Slam Session H, Wednesday, October 6, 2021, 6:00 - 8:00 am PDT

Katharina Menn1, Christine Michel1,2, Lars Meyer1, Stefanie Hoehl3, Claudia Männel1,4; 1Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany, 2Leipzig University, Germany, 3University of Vienna, Austria, 4Charité – University Medicine Berlin, Germany

Infants are social individuals and engage in interactions long before they can speak (Bell, 1974). When interacting with infants, adults use a characteristic register, termed infant-directed speech (IDS) (Soderstrom, 2007). Infants prefer IDS over adult-directed speech (ADS) (Cooper & Aslin, 1990); moreover, IDS assists infants' word segmentation and recognition (Schreiner & Mani, 2017; Singh et al., 2009). This IDS benefit has been argued to reflect enhanced amplitude modulations at the frequency of intonation (< 3 Hz; Leong et al., 2017), a cue that is critical for word segmentation (Goswami, 2019). While IDS is known to benefit infants' electrophysiological tracking of speech (Kalashnikova et al., 2018), it remains unclear whether this benefit results specifically from intonation or from other factors, such as the syllabic rhythm. To test this, we compared infants' tracking of IDS and ADS at both the intonation rate (1-3 Hz) and the syllable rate (3.3-8.3 Hz). In parent-infant dyads (n = 30), parents described novel objects to their 9-month-olds while the infants' EEG was recorded. For IDS, parents were instructed to talk to their infants as they typically would; for ADS, parents were instructed to describe the objects to an adult. Acoustic modulations were enhanced in IDS (all p < .005). Cortical tracking of speech was assessed by speech-brain coherence, which measures the synchronization between the EEG and the speech envelope. Higher synchronization between neural activity and speech supports speech processing (for review, see Meyer, 2018). We expected higher speech-brain coherence at the syllabic and intonation rates for IDS compared to ADS, indicating increased neural tracking of slow amplitude modulations. Our analyses revealed significant speech-brain coherence at both the syllabic and intonation rates (both p < .001), indicating that infants track speech during natural interactions. In addition, we found significantly higher speech-brain coherence for IDS than for ADS at the intonation rate (p = .01), but not at the syllabic rate (p = .31), indicating that the IDS benefit arises primarily from enhanced intonation. Thus, neural tracking is sensitive to parents' speech adaptations during natural interactions, probably facilitating higher-level inferential processes such as word segmentation. This makes neural tracking a potential neural mechanism for infants' word segmentation from continuous speech.
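For readers unfamiliar with the coherence measure described above, the following is a minimal Python sketch of how speech-brain coherence could be computed in the two frequency bands of interest. It is not the authors' analysis pipeline: the sampling rate, window length (nperseg), and the random placeholder signals are assumptions; in a real analysis the envelope would be extracted from the audio recording and aligned with the infant's EEG.

import numpy as np
from scipy.signal import coherence

# Hypothetical setup: one EEG channel and the speech amplitude envelope,
# both already sampled at the same rate (random placeholders stand in for real data).
fs = 500                               # assumed common sampling rate in Hz
rng = np.random.default_rng(0)
eeg = rng.standard_normal(fs * 60)     # 60 s of one EEG channel
env = rng.standard_normal(fs * 60)     # 60 s of the speech envelope

def band_coherence(x, y, band, fs, nperseg=2048):
    """Mean magnitude-squared coherence between x and y within a frequency band."""
    f, coh = coherence(x, y, fs=fs, nperseg=nperseg)
    mask = (f >= band[0]) & (f <= band[1])
    return coh[mask].mean()

intonation = band_coherence(eeg, env, band=(1.0, 3.0), fs=fs)   # intonation rate, 1-3 Hz
syllable = band_coherence(eeg, env, band=(3.3, 8.3), fs=fs)     # syllable rate, 3.3-8.3 Hz
print(f"speech-brain coherence  intonation: {intonation:.3f}  syllable: {syllable:.3f}")

With random placeholder signals the band-averaged coherence is near chance; the contrast reported in the abstract would correspond to comparing these band averages between the IDS and ADS conditions across infants.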
