Acoustical Science and Technology
Online ISSN : 1347-5177
Print ISSN : 1346-3969
ISSN-L : 0369-4232
INVITED PAPERS
Mapping the perceptual topology of auditory space permits the creation of hyperstable virtual acoustic environments
W. Owen Brimijoin, Shawn Featherly, Philip Robinson

2020 Volume 41 Issue 1 Pages 245-248

Abstract

The perception of acoustic motion is not uniform as a function of azimuth; listeners need roughly twice as much motion at the side as at the front to judge the two motions as equivalent. Self-generated acoustic motion perception has also been shown to be distorted: sounds moved slightly with the listener's head are more consistently judged to be world-stable than those that are truly static. These distortions can be captured by a model that incorporates a head-centric warping of perceived sound location, characterized by a displacement in apparent sound location away from the acoustic midline. Such a distortion has been demonstrated; listeners tend to overestimate azimuth when asked to point at a sound source while keeping their head and eyes fixated straight ahead. Here we show that this mathematical framework may be inverted, and we demonstrate the benefits of re-mapping sound source locations toward the auditory midline. We show that listeners prefer different amounts of spatial remapping, but none preferred no remapping. Modelling shows minimal impact on spatial release from masking for small amounts of remapping, demonstrating that it is possible to achieve a more stable perceptual environment without sacrificing speech intelligibility in spatially complex environments.
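The abstract does not specify the remapping function itself. As an illustrative sketch only, a linear compression of source azimuth toward the midline could take the following form; the function name, the hemifield handling, and the `compression` factor are assumptions for illustration, not the authors' model:

```python
def remap_azimuth(azimuth_deg, compression=0.85):
    """Illustrative sketch: compress a source azimuth toward the midline.

    `compression` < 1 shrinks the rendered azimuth relative to the
    physical one; 1.0 leaves locations unchanged. Azimuths are in
    degrees, positive to the listener's right.

    NOTE: this is a hypothetical linear warp, not the model from the
    paper, which the abstract does not reproduce.
    """
    # Wrap into [-180, 180) so rear sources are handled consistently.
    wrapped = (azimuth_deg + 180.0) % 360.0 - 180.0
    if abs(wrapped) <= 90.0:
        # Frontal hemifield: compress toward the front midline (0 deg).
        return compression * wrapped
    # Rear hemifield: compress toward the rear midline (+/-180 deg)
    # instead, so sources do not cross between hemifields.
    rear_offset = 180.0 - abs(wrapped)
    sign = 1.0 if wrapped >= 0 else -1.0
    return sign * (180.0 - compression * rear_offset)
```

A factor of 0.85 maps a source at 60 degrees to 51 degrees, a small shift consistent with the abstract's claim that modest remapping leaves spatial release from masking largely intact.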

© 2020 by The Acoustical Society of Japan