Having easily accessible and accurate directivity patterns of sound sources is valuable for many applications, including architectural acoustics modeling and spatial audio. To provide this information to other researchers, the authors of this presentation are creating an online directivity database for live human speech, played musical instruments, and other sources of sound. The results are derived from recordings of the sources over their useful bandwidths at 2522 unique microphone positions over a surrounding sphere (i.e., with 5-deg resolution in both the polar and azimuthal angles). Processing of the recordings has led to frequency-dependent spherical-harmonic expansions. The expansion coefficients, as well as broadband tabulated attenuation results (commonly used in architectural acoustics simulation packages), are freely available in ASCII format. The database also contains figures and animations of the directivity patterns, allowing for quick visualization. The collections should help improve modeling of various acoustic spaces, microphone placements for recordings, and general understanding of source radiation characteristics.
Meeting abstract. No PDF available.
Published: October 01 2019
An archival database of high-resolution directivities
Rachel C. Edelman
Phys. and Astronomy, Brigham Young Univ., 475 W 1720 N, Apt 1-303, Provo, UT 84604, [email protected]
Timothy W. Leishman
Phys. and Astronomy, Brigham Young Univ., Provo, UT
J. Acoust. Soc. Am. 146, 2803 (2019)
Citation
Rachel C. Edelman, Samuel Bellows, Timothy W. Leishman; An archival database of high-resolution directivities. J. Acoust. Soc. Am. 1 October 2019; 146 (4_Supplement): 2803. https://doi.org/10.1121/1.5136709