Recently, "Virtual Reality" (VR) has become one of the most attractive new technologies. VR is also known as "Virtual Environment", or VE. In this paper, we use the term "VE" because we are not working on training simulators but on systems for general users. VE technology has advanced rapidly and offers many promising applications in areas such as training and medicine. However, a potential hazard to users of virtual environments has been found: some users complain of discomfort during and after the experience. This phenomenon is similar to motion sickness and has been called "simulator sickness". Simulator sickness and sickness in virtual environments are directly linked: both are forms of visually induced motion sickness. However, we believe there is a significant difference between general-purpose VEs and simulators for special training. This report surveys the literature on simulator sickness and motion sickness in relation to discomfort in virtual environments, especially from the viewpoint of autonomic nervous system (ANS) responses. We hope this research suggests ways to combat such risks to users and contributes to the expansion of safe VR and VE technologies.
Although modern telecommunication systems have drastically changed our styles of social communication, hearing-impaired people cannot benefit from them because they are based on phonetic media. This paper introduces a new telecommunication system for sign language that uses virtual reality technology to enable natural sign conversation over an ultra-low-bit-rate channel. The system transmits kinematic data of upper-body motions instead of visual images of the body. An avatar acting as the sender appears on the receiver's terminal and mediates the sign conversation. This avatar-based telecommunication system achieves a high frame rate and high resolution over an analogue telephone line. A prototype system, S-TEL, implementing this idea on UDP/IP, provided effective sign conversation over a lossy channel and proved superior to conventional Internet video chat.
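To illustrate why kinematic data fits on an analogue line where video cannot, the following is a minimal sketch of per-frame serialization. The frame layout (20 upper-body joint angles plus a 16-bit frame counter) is an assumption for illustration, not the actual S-TEL wire format:

```python
import struct

# Hypothetical frame layout (an assumption, not the S-TEL format):
# a 16-bit frame counter followed by one 32-bit float per joint angle
# for a 20-joint upper-body model. 2 + 20*4 = 82 bytes per frame, so
# 30 frames/s needs under 20 kbit/s -- well within a 33.6 kbit/s
# analogue modem line, unlike video of comparable frame rate.
NUM_JOINTS = 20
FMT = "<H" + "f" * NUM_JOINTS  # little-endian: counter, then angles

def pack_frame(counter, angles):
    """Serialize one motion frame for transmission (e.g. as a UDP payload)."""
    return struct.pack(FMT, counter & 0xFFFF, *angles)

def unpack_frame(payload):
    """Deserialize a frame; returns (counter, list_of_joint_angles)."""
    fields = struct.unpack(FMT, payload)
    return fields[0], list(fields[1:])
```

Because each frame is self-contained and carries a counter, a receiver over a lossy UDP channel can simply render the newest frame and drop late or missing ones, which matches the paper's observation that sign conversation survives packet loss.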
Cutting is an essential operation for forming and designing shapes. We implemented a virtual cutting workspace in which users can cut virtual objects of various shapes through a direct-manipulation interface. An object in the workspace is defined as a closed surface consisting of triangular patches, and the cutting operation is performed as a boolean operation between the object and a cutting surface. We also employed a force-feedback device and liquid-crystal shutter glasses to present force sensation while cutting and to provide a stereoscopic view of the workspace.
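As a minimal sketch of the first step of such a boolean cut (an assumption for illustration, not the paper's full algorithm, and simplified to a planar cutting surface): each triangle of the closed surface is classified against the cut by the signed distances of its vertices. Triangles that straddle the cut would then be split and the two halves re-closed:

```python
def signed_distance(point, plane_point, plane_normal):
    """Signed distance from a point to a plane (positive on the normal side)."""
    return sum((p - q) * n for p, q, n in zip(point, plane_point, plane_normal))

def classify_triangle(tri, plane_point, plane_normal, eps=1e-9):
    """Classify one triangle (three 3D vertices) against the cutting plane.

    Returns 'above', 'below', or 'straddles'. Straddling triangles are
    the ones a full boolean cut must split along the intersection line.
    """
    d = [signed_distance(v, plane_point, plane_normal) for v in tri]
    if all(x >= -eps for x in d):
        return "above"
    if all(x <= eps for x in d):
        return "below"
    return "straddles"
```

A full implementation would additionally intersect each straddling triangle's edges with the cutting surface, re-triangulate the pieces, and cap the exposed cross-sections so both resulting objects remain closed surfaces.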
This paper introduces an example of a multi-user mixed reality application, the AR^2 hockey (Augmented Reality AiR hockey) system. The players share a physical space and a virtual space that are consistently aligned, and alternately shoot a virtual puck with real mallets, aiming at the opponent's goal. In this kind of application, time lag and spatial error between the two spaces are critical; in this system, we have tried to resolve these problems by utilising sensor fusion. The configuration of the system is described.