Speech perception is a multi-sensory experience. Visual information enhances [Sumby and Pollack (1954). J. Acoust. Soc. Am. 25, 212–215] and interferes [McGurk and MacDonald (1976). Nature 264, 746–748] with speech perception. Similarly, tactile information, transmitted by puffs of air arriving at the skin and aligned with speech audio, alters [Gick and Derrick (2009). Nature 462, 502–504] auditory speech perception in noise. It has also been shown that aero-tactile information influences visual speech perception when an auditory signal is absent [Derrick, Bicevskis, and Gick (2019a). Front. Commun. Lang. Sci. 3(61), 1–11]. However, researchers have not yet identified the combined influence of aero-tactile, visual, and auditory information on speech perception. The effects of matching and mismatching visual and tactile speech on two-way forced-choice auditory syllable-in-noise classification tasks were tested. The results showed that both visual and tactile information altered the signal-to-noise ratio (SNR) threshold for accurate identification of auditory signals. Consistent with previous studies, the visual component had a strong influence on auditory syllable-in-noise identification, as evidenced by a 28.04 dB improvement in SNR between matching and mismatching visual stimulus presentations. In comparison, the tactile component had a smaller influence, resulting in a 1.58 dB SNR match-mismatch range. The effects of the visual and tactile information were shown to be additive.
Published: November 25, 2019
Tri-modal speech: Audio-visual-tactile integration in speech perception
Donald Derrick,a)
New Zealand Institute of Language, Brain, and Behaviour, University of Canterbury, 20 Kirkwood Avenue, Upper Riccarton, Christchurch 8041, New Zealand
Doreen Hansmann
School of Psychology, Speech and Hearing, University of Canterbury, 20 Kirkwood Avenue, Upper Riccarton, Christchurch 8041, New Zealand
Catherine Theys
School of Psychology, Speech and Hearing, University of Canterbury, 20 Kirkwood Avenue, Upper Riccarton, Christchurch 8041, New Zealand
a) Electronic mail: donald.derrick@canterbury.ac.nz
J. Acoust. Soc. Am. 146, 3495–3504 (2019)
Article history: Received May 20, 2019; Accepted October 25, 2019
Citation
Donald Derrick, Doreen Hansmann, Catherine Theys; Tri-modal speech: Audio-visual-tactile integration in speech perception. J. Acoust. Soc. Am. 1 November 2019; 146 (5): 3495–3504. https://doi.org/10.1121/1.5134064