Abstract
Much research has been done worldwide on support systems for visually impaired persons, but presenting real-time information that changes around a user remains difficult. In this paper, we introduce a visual support system that presents three-dimensional visual information using three-dimensional virtual sound. Three-dimensional information is obtained by analyzing images captured by small stereo cameras, and objects of interest to the user are recognized. The user hears three-dimensional virtual sounds, generated with Head-Related Transfer Functions (HRTFs), that correspond to the positions and movements of these objects. The user's auditory sense is not impaired because a bone-conduction headset, which does not block environmental sound, is used. The proposed system is useful in places where the infrastructure is incomplete and when the situation changes in real time; we plan to apply it, for example, to walking assistance and sports. Experimental results show that front-back confusions in sound localization are frequent, and that active operation by the user is needed to localize sound positions more accurately.
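The core rendering step the abstract describes, placing a sound at an object's 3D position via HRTFs, amounts to convolving a mono source with the left- and right-ear head-related impulse responses (HRIRs) for that direction. The sketch below is illustrative only and is not the authors' implementation: the HRIRs here are synthetic placeholders, whereas a real system would load a measured HRTF set and select the pair matching the object's azimuth and elevation.

```python
import numpy as np

def spatialize(mono, hrir_left, hrir_right):
    """Render a mono signal binaurally by convolving it with the
    head-related impulse responses (HRIRs) for a virtual direction."""
    left = np.convolve(mono, hrir_left)    # full convolution, len = N + M - 1
    right = np.convolve(mono, hrir_right)
    return np.stack([left, right], axis=0)  # shape (2, N + M - 1)

# Synthetic placeholder HRIRs (a real system uses measured HRTF data).
rng = np.random.default_rng(0)
fs = 44100
mono = np.sin(2 * np.pi * 440 * np.arange(4410) / fs)          # 0.1 s tone
hrir_l = rng.standard_normal(128) * np.exp(-np.arange(128) / 16)
hrir_r = np.roll(hrir_l, 20)  # crude interaural time difference, illustration only
binaural = spatialize(mono, hrir_l, hrir_r)
```

Because HRIR pairs with similar interaural cues exist in front of and behind the listener, static rendering of this kind is prone to exactly the front-back confusions the abstract reports.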