Neuroscience shows that auditory neurons extract a variety of specific signal parameters such as frequency, amplitude modulation, frequency modulation, and onsets. Furthermore, biology gives us examples of auditory foveae in bats and owls, where behaviorally relevant signal parameters are overrepresented. This suggests that biology values nuances of these parameters more than compact-coding considerations such as orthonormal basis functions. In contrast to typical spectrogram approaches, where a specific time window is chosen and the Fast Fourier Transform provides non-overlapping orthogonal rectangular tilings, we start with the smallest tiling possible. The uncertainty principle gives us this tiling as the Gaussian-modulated sinusoid. Rotating this tiling in the time-frequency plane gives us a chirplet. We construct many chirplets (a chirplet set) centered at the same time-frequency point and then find the chirplet that best matches the signal at that point. Because we can sample at arbitrary locations in the time-frequency plane with different chirplet sets, we can create customizable artificial auditory foveae. We use this chirplet front end to classify marmoset vocalizations and compare it to typical spectrogram-based methods for audio classification.
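
As a concrete illustration of the chirplet matching described above, the sketch below (a minimal example, not the authors' implementation) builds a small chirplet set at one time-frequency point: Gaussian-windowed linear chirps that share a center time and center frequency but differ in chirp rate, i.e., in how the Gaussian tiling is rotated in the time-frequency plane. The sample rate, window width, chirp-rate range, and test signal are all assumptions chosen for illustration.

import numpy as np

def chirplet(t, t0, f0, chirp_rate, sigma):
    """Gaussian-modulated sinusoid with a linear frequency sweep (a chirplet)."""
    dt = t - t0
    envelope = np.exp(-0.5 * (dt / sigma) ** 2)                    # Gaussian window
    phase = 2.0 * np.pi * (f0 * dt + 0.5 * chirp_rate * dt ** 2)   # linear frequency sweep
    g = envelope * np.exp(1j * phase)
    return g / np.linalg.norm(g)                                   # unit energy for fair comparison

fs = 22050.0                                   # assumed sample rate (Hz)
t = np.arange(0.0, 0.1, 1.0 / fs)              # 100-ms analysis frame
t0, f0, sigma = 0.05, 4000.0, 0.005            # center time (s), center frequency (Hz), width (s)

# Illustrative test signal: a downward chirp passing through (t0, f0), plus noise.
true_rate = -2.0e5                             # Hz per second
signal = np.cos(2.0 * np.pi * (f0 * (t - t0) + 0.5 * true_rate * (t - t0) ** 2))
signal += 0.1 * np.random.randn(t.size)

# Chirplet set at this time-frequency point: one chirplet per candidate chirp rate.
chirp_rates = np.linspace(-4.0e5, 4.0e5, 41)   # assumed range of chirp rates (Hz/s)
scores = [abs(np.vdot(chirplet(t, t0, f0, c, sigma), signal)) for c in chirp_rates]
best_rate = chirp_rates[int(np.argmax(scores))]
print(f"best-matching chirp rate: {best_rate:.0f} Hz/s")

A full front end along these lines would evaluate such chirplet sets at many time-frequency centers, placing denser or richer sets where finer detail matters, which is what makes the resulting artificial fovea customizable.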
Meeting abstract. No PDF available.
A customizable artificial auditory fovea
Christopher N. Casebeer
Elec. and Comput. Eng., Montana State Univ., Cobleigh 541, Bozeman, MT 59715, [email protected]
Ross K. Snider
Elec. and Comput. Eng., Montana State Univ., Cobleigh 541, Bozeman, MT 59715, [email protected]
J. Acoust. Soc. Am. 143, 1825 (2018)
Citation
Christopher N. Casebeer, Ross K. Snider; A customizable artificial auditory fovea. J. Acoust. Soc. Am. 1 March 2018; 143 (3_Supplement): 1825. https://doi.org/10.1121/1.5035995