Abstract
Virtual reality technology provides an interface between humans and computers. To make virtual environments more realistic, it is essential to investigate sensory-motor coordination. In this paper, pointing movements and gaze movements were examined as instances of sensory-motor coordination. Pointing movements, namely the responses of normal subjects to visual and auditory targets, were measured.
The measuring system was composed of a display unit, an eye- and head-movement measuring unit, and an indicated-point measuring unit. The display unit presented targets in the transverse plane of the eyes: 19 targets (LEDs and speakers) were arranged from left to right at 10-degree separation, at a distance of 50 cm. The stimulation period of each target was 200 ms, which matches the latency of saccadic eye movements. The eye- and head-movement measuring unit comprised two instruments: eye movements were recorded by EOG (electro-oculography), and head movements were measured by a '3SPACE FASTRAK' (Polhemus Co.), which uses an electromagnetic tracking technique. The indicated-point measuring unit, based on a touch panel, measured the azimuth and elevation of the indicated points. The sampling frequency of the overall experimental system was 40 Hz. Subjects continuously fixated an LED presented straight ahead (0 deg); proper fixation was monitored by an operator. Targets for pointing were presented on the display in a random sequence at eight positions. In a later data inspection, responses were discarded when the eyes had drifted or made a saccade during the arm movement.
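The target geometry described above can be sketched as follows. This is a minimal illustration, assuming the 19 targets span −90° to +90° in the horizontal plane (the exact angular range is not stated in the abstract):

```python
import math

RADIUS_CM = 50.0  # target distance from the subject, per the apparatus description

# 19 targets at 10-degree separation; the -90..+90 deg span is an assumption
angles_deg = list(range(-90, 91, 10))

def target_position(angle_deg, radius=RADIUS_CM):
    """Return (x, y) in cm: x positive rightward, y straight ahead."""
    a = math.radians(angle_deg)
    return (radius * math.sin(a), radius * math.cos(a))

positions = [target_position(a) for a in angles_deg]
```

The straight-ahead fixation LED (0 deg) corresponds to the middle entry of this list, directly in front of the subject at 50 cm.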
From the responses to each movement, the shift and S.D. of the indicated points increased with the target presentation angle. This outcome revealed that responses in the peripheral visual field are underestimated in sensory-motor coordination.
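The two quantities reported here, shift (constant error) and S.D. (variable error) of the indicated points per target angle, can be computed as in this sketch. The error values in the usage comment are purely illustrative, not data from the experiment:

```python
from statistics import mean, stdev

def shift_and_sd(errors_deg):
    """Constant error (mean shift) and variable error (sample S.D.)
    of pointing errors at one target angle, in degrees of azimuth."""
    return mean(errors_deg), stdev(errors_deg)

# Illustrative pointing errors (deg) for a single target angle:
m, s = shift_and_sd([1.0, 2.0, 3.0])  # → (2.0, 1.0)
```

Plotting these two statistics against the target angle is what reveals the increase of both shift and S.D. toward the peripheral visual field.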