The use of visual information in non-native speech sound discrimination across the first year of life

D. Kyle Danielson, Padmapriya A. Kandhadai, and Janet F. Werker
Department of Psychology, University of British Columbia, 2136 West Mall, Vancouver, British Columbia V6T 1Z4, Canada
kdanielson@psych.ubc.ca

J. Acoust. Soc. Am. 137 (4_Supplement), 2432 (1 April 2015). https://doi.org/10.1121/1.4920876
Meeting abstract. No PDF available.

Infants are able to match seen and heard speech even in non-native languages, and familiarization to audiovisual speech appears to affect subsequent auditory-only discrimination of non-native speech sounds (Danielson et al., 2013; 2014). However, the robustness of these behaviors appears to change rapidly within the first year of life. In the current set of studies, conducted with 6-, 9-, and 10-month-old English-learning infants, we examine the developmental trajectory of audiovisual speech perception of non-native speech sounds. First, we show that the tendency to detect a mismatch between heard and seen speech sounds in a non-native language changes across this short developmental period, in tandem with the trajectory of auditory perceptual narrowing (Werker & Tees, 1984; Kuhl et al., 1992; inter alia). Furthermore, we demonstrate that familiarization to matching and mismatching audiovisual speech affects infants' auditory speech perception differently at different ages: whereas 6-month-old infants' auditory speech perception appears to be malleable in the face of prior audiovisual familiarization, this malleability declines with age. The current set of studies is among the first to combine traditional looking-time measurements with pupillometry as a correlate of infants' acoustic change detection (Hochmann & Papeo, 2014).