2015, Vol. 27, No. 4, pp. 143-147
In this study, we consider an interface through which a human can obtain information about a remote environment from a mobile robot using eye-gaze input. In this system, a camera mounted on the mobile robot captures the remote scene, and the image is displayed to the user. The eye-gaze input system recognizes the position of the user's attention in the image, and the orientation of the robot and camera is adjusted so that this position moves to the center of the image, allowing the user to recognize the target easily. In the eye-gaze input system, a particle filter is applied to track the movement of the dark part of the eye (iris and pupil) and thereby improve gaze recognition. The desirable movement speed of the image while the orientation of the camera and mobile robot is adjusted was examined through psychological evaluations; the results show that desirable speeds exist for both the pitch and yaw angles. In addition, experiments in which participants recognized an object in the remote environment were conducted using the proposed method and the obtained speeds. The results show that the eye-gaze trajectory becomes smooth and that the user can recognize the object easily, demonstrating the effectiveness of the proposed method.
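As a rough illustration of the particle-filter idea mentioned above, the sketch below shows one predict-weight-resample update for tracking the dark part of the eye in a grayscale image. This is not the authors' implementation: the random-walk motion model, the darkness-based likelihood, and the parameter `motion_std` are assumptions made here for illustration only.

```python
# Minimal sketch (assumed, not the paper's implementation) of particle-filter
# tracking of the dark part of the eye in a grayscale frame, where the
# pupil/iris region is darker than its surroundings.
import numpy as np

def track_dark_region(frame, particles, motion_std=5.0):
    """One particle-filter update: predict, weight by darkness, resample.

    frame     : 2-D numpy array (grayscale image, intensities 0-255)
    particles : (N, 2) array of candidate (row, col) positions
    Returns the estimated center and the resampled particle set.
    """
    n = len(particles)
    h, w = frame.shape

    # Predict: random-walk motion model (assumed; the abstract does not specify one).
    particles = particles + np.random.normal(0.0, motion_std, particles.shape)
    particles[:, 0] = np.clip(particles[:, 0], 0, h - 1)
    particles[:, 1] = np.clip(particles[:, 1], 0, w - 1)

    # Weight: darker pixels (lower intensity) receive higher likelihood.
    rows = particles[:, 0].astype(int)
    cols = particles[:, 1].astype(int)
    weights = 255.0 - frame[rows, cols].astype(float)
    total = weights.sum()
    weights = weights / total if total > 0 else np.full(n, 1.0 / n)

    # Estimate: weighted mean of particle positions approximates the dark-part center.
    center = weights @ particles

    # Resample: multinomial resampling concentrates particles on dark regions.
    idx = np.random.choice(n, size=n, p=weights)
    return center, particles[idx]
```

The estimated center could then be compared with the image center to drive the pitch and yaw adjustment of the camera and robot at the speeds found in the psychological evaluations.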