2021, Vol. 26, No. 4, pp. 277-287
Wind displays are devices that enhance the sense of presence in virtual reality (VR) by simulating a realistic sensation of wind. However, certain wind displays require many wind sources because they physically reproduce the full range of wind directions. Methods that manipulate the perceived direction of wind through cross-modal effects, and thereby realize wind displays with fewer wind sources, have been proposed. We hypothesized that a visuo-audio-haptic cross-modal effect can enhance this manipulation. We therefore quantitatively investigated the effect of congruent visual and audio information on perceived wind directions. We presented virtual images of flowing particles and three-dimensional wind sounds as cues indicating “virtual” wind directions. A user study showed that presenting both visual and audio information is significantly more effective than presenting either the audio or the visual information alone. The median absolute error between the perceived and virtual wind directions was at most 34.8° when the visual and audio information was presented together with physical wind from the front or behind.