In this study, we propose an interaction method that links sound texture representation and sound image localization using granular synthesis, in order to realize spatial acoustic feedback in response to body movements. Using a single sound file containing multiple environmental sounds arranged chronologically, and controlling the playback position from hand position data, the user can explore sound textures as if moving through a virtual space. The proposed system was implemented under two exhibition conditions, a 15-channel loudspeaker environment and a 4-channel quadraphonic environment; user comments confirmed positive responses regarding both the synchronization of sound texture with motion and the perceived spatial extent. Even with a configuration using a small number of loudspeakers, sound image tracking and texture changes in response to hand movements were perceived, suggesting the flexibility and versatility of the method. In addition, playback position control mapped to the vertical axis proved effective for designing interactions that promote spatial cognition. This study presents new potential applications of spatial acoustics based on the connection between body movement and sound, and suggests future directions for multisensory interaction design in VR/AR environments.
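The core mechanism described above, mapping hand position to a playback position in a long, chronologically arranged sound file and synthesizing windowed grains around that position, can be sketched as follows. This is a minimal illustration only, not the authors' implementation; the function names, the normalized 0.0-1.0 hand coordinate, and the grain parameters are assumptions for demonstration.

```python
import numpy as np

def hand_to_playhead(hand_y, file_duration_s, sr):
    """Hypothetical mapping: normalized hand height (0.0-1.0) to a
    sample offset in a sound file laid out chronologically, so that
    moving the hand scans through the sequence of environmental sounds."""
    hand_y = min(max(hand_y, 0.0), 1.0)
    return int(hand_y * (file_duration_s * sr - 1))

def make_grain(buffer, center, grain_len, jitter, rng):
    """Extract one Hann-windowed grain near the playhead; the random
    jitter is a common granular-synthesis trick to avoid a static loop."""
    offset = center + int(rng.integers(-jitter, jitter + 1))
    offset = int(np.clip(offset, 0, len(buffer) - grain_len))
    return buffer[offset:offset + grain_len] * np.hanning(grain_len)

# Demo with a synthetic 10-second buffer standing in for the sound file.
sr = 44100
buffer = np.sin(2 * np.pi * 440 * np.arange(10 * sr) / sr)
rng = np.random.default_rng(0)
center = hand_to_playhead(0.5, 10.0, sr)           # hand at mid-height
grain = make_grain(buffer, center, grain_len=2048, jitter=512, rng=rng)
```

In a real-time system, grains like this would be generated continuously, overlap-added, and then panned across the loudspeaker array to localize the sound image at the hand; that spatialization stage is omitted here.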