Audio-visual [McGurk and MacDonald (1976). Nature 264, 746–748] and audio-tactile [Gick and Derrick (2009). Nature 462(7272), 502–504] speech stimuli enhance speech perception over audio stimuli alone. In addition, multimodal speech stimuli form an asymmetric window of integration that is consistent with the relative speeds of the various signals [Munhall, Gribble, Sacco, and Ward (1996). Percept. Psychophys. 58(3), 351–362; Gick, Ikegami, and Derrick (2010). J. Acoust. Soc. Am. 128(5), EL342–EL346]. In this experiment, participants were presented video of faces producing /pa/ and /ba/ syllables, both alone and with air puffs occurring synchronously and at different timings up to 300 ms before and after the stop release. Perceivers were asked to identify the syllable they perceived, and were more likely to respond that they perceived /pa/ when air puffs were present, with asymmetrical preference for puffs following the video signal—consistent with the relative speeds of visual and air puff signals. The results demonstrate that visual-tactile integration of speech perception occurs much as it does with audio-visual and audio-tactile stimuli. This finding contributes to the understanding of multimodal speech perception, lending support to the idea that speech is not perceived as an audio signal that is supplemented by information from other modes, but rather that primitives of speech perception are, in principle, modality neutral.
Published: November 09 2016
Visual-tactile integration in speech perception: Evidence for modality neutral speech primitives
Katie Bicevskis, Department of Linguistics, University of British Columbia, Totem Field Studios (Main Department), 2613 West Mall, Vancouver, British Columbia V6T 1Z4, Canada
Donald Derrick,a) University of Canterbury, NZILBB, Private Bag 4800, Christchurch 8140, New Zealand
Bryan Gick,b) Department of Linguistics, University of British Columbia, Totem Field Studios (Main Department), 2613 West Mall, Vancouver, British Columbia V6T 1Z4, Canada
a) Electronic mail: donald.derrick@gmail.com
b) Also at: Haskins Laboratories, New Haven, CT 06511, USA.
J. Acoust. Soc. Am. 140, 3531–3539 (2016)
Article history
Received: February 12 2016
Accepted: October 10 2016
Citation
Katie Bicevskis, Donald Derrick, Bryan Gick; Visual-tactile integration in speech perception: Evidence for modality neutral speech primitives. J. Acoust. Soc. Am. 1 November 2016; 140 (5): 3531–3539. https://doi.org/10.1121/1.4965968