Hand gestures combine with speech to form a single integrated system of meaning during language comprehension (Kelly et al., 2010). However, it is unknown whether gesture is uniquely integrated with speech or is processed like any other manual action. Thirty-one participants watched videos presenting speech with gestures or manual actions on objects. The relationship between the speech and gesture/action was either complementary (e.g., “He found the answer,” while producing a calculating gesture vs. actually using a calculator) or incongruent (e.g., the same sentence paired with the incongruent gesture/action of stirring with a spoon). Participants watched the video (prime) and then responded to a written word (target) that was or was not spoken in the video prime (e.g., “found” or “cut”). ERPs were recorded to the primes (time-locked to the spoken verb, e.g., “found”) and to the written targets. For primes, there was a larger frontal N400 (semantic processing) to incongruent vs. congruent items in the gesture, but not the action, condition. For targets, the P2 (phonemic processing) was smaller for target words following congruent vs. incongruent gesture primes, but not action primes. These findings suggest that hand gestures are integrated with speech in a privileged fashion compared to manual actions on objects.
April 2012
Meeting abstract. No PDF available.
The communicative influence of gesture and action during speech comprehension: gestures have the upper hand
Spencer Kelly, Colgate University, skelly@colgate.edu
Meghan Healey, National Institutes of Health
Asli Ozyurek, Max Planck Institute for Psycholinguistics
Judith Holler, Max Planck Institute for Psycholinguistics, University of Manchester
J. Acoust. Soc. Am. 131, 3311 (2012)
Citation
Spencer Kelly, Meghan Healey, Asli Ozyurek, Judith Holler; The communicative influence of gesture and action during speech comprehension: gestures have the upper hand. J. Acoust. Soc. Am. 1 April 2012; 131 (4_Supplement): 3311. https://doi.org/10.1121/1.4708385