In four experiments, listeners were either rotated or stationary, and sounds either came from a stationary loudspeaker or rotated from loudspeaker to loudspeaker around an azimuth array. When either sounds or listeners rotate, the auditory cues used for sound source localization change, yet in the everyday world listeners perceive sound rotation only when sounds rotate, not when listeners rotate. In the everyday world, sound source locations are referenced to positions in the environment (a world-centric reference system), whereas the auditory cues for sound source location indicate locations relative to the head (a head-centric reference system), not locations relative to the world. This paper addresses the general hypothesis that determining the world-centric location of sound sources requires the auditory system to have information both about the auditory cues used for sound source location and about head position. The use of visual and vestibular information in determining rotating head position during sound rotation perception was investigated. The experiments show that sound rotation perception when sources and listeners rotate was based on acoustic, visual, and, perhaps, vestibular information. The findings are consistent with the general hypothesis and suggest that sound source localization is not based on acoustics alone; it is a multisystem process.
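The general hypothesis rests on a simple reference-frame relation: acoustic cues specify the azimuth of a source relative to the head, so a world-centric location can only be recovered by combining those cues with head-position information from other systems (e.g., visual or vestibular). The sketch below is illustrative only and is not from the article; the function names and the degrees-of-azimuth convention are assumptions. It shows why acoustic cues alone cannot distinguish a rotating source from a rotating listener.

```python
# Illustrative sketch (not from the article): the reference-frame relation
# assumed by the general hypothesis. Angles are azimuths in degrees.

def head_centric_azimuth(world_azimuth, head_orientation):
    """Azimuth of the source relative to the head (what acoustic cues encode)."""
    return (world_azimuth - head_orientation) % 360

def world_centric_azimuth(head_centric, head_orientation):
    """Recovering the world-referenced location requires knowing head position."""
    return (head_centric + head_orientation) % 360

# A source stepping counterclockwise around a stationary listener ...
source_rotates = [head_centric_azimuth(az, 0) for az in range(0, 360, 45)]

# ... produces the same sequence of head-centric cues as a stationary source
# heard by a listener whose head rotates the opposite way,
listener_rotates = [head_centric_azimuth(0, -az) for az in range(0, 360, 45)]

# so the acoustic cues by themselves are ambiguous; head-position information
# (visual, vestibular) is needed to assign a world-centric location.
assert source_rotates == listener_rotates
```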
Judging sound rotation when listeners and sounds rotate: Sound source localization is a multisystem process
William A. Yost,a)
Speech and Hearing Science, Arizona State University, P.O. Box 870102, Tempe, Arizona 85287, USA
Xuan Zhong
Speech and Hearing Science, Arizona State University, P.O. Box 870102, Tempe, Arizona 85287, USA
Anbar Najam
Speech and Hearing Science, Arizona State University, P.O. Box 870102, Tempe, Arizona 85287, USA
a) Electronic mail: william.yost@asu.edu
J. Acoust. Soc. Am. 138, 3293–3310 (2015)
Article history
Received: December 02 2014; Accepted: October 21 2015
Citation
William A. Yost, Xuan Zhong, Anbar Najam; Judging sound rotation when listeners and sounds rotate: Sound source localization is a multisystem process. J. Acoust. Soc. Am. 1 November 2015; 138 (5): 3293–3310. https://doi.org/10.1121/1.4935091