Life Support
Online ISSN : 1884-5827
Print ISSN : 1341-9455
ISSN-L : 1341-9455
Special Issue (Rapid Communications)
Acquisition of Environmental Information from a Mobile Robot Using Eye-Gaze Input
Satoru Shibata, Tomonori Yamamoto
Journal Free Access

2015, Volume 27, Issue 4, pp. 143-147

Abstract

In this study, an interface is considered with which a human can obtain information on remote surroundings from a mobile robot using eye-gaze input. With this system, information on the remote environment is acquired by a camera mounted on the mobile robot, and the image is displayed to the user. The point of the user's attention in the image is recognized by the eye-gaze input system, and the orientation of the robot and camera is adjusted so that this point moves to the center of the image, allowing the user to recognize the target easily. In the eye-gaze input system, particle filter processing is applied to detect the movement of the dark part of the eye (iris and pupil) and thereby improve recognition accuracy. The desirable movement speed of the image while the orientation of the camera and mobile robot is being adjusted was examined through psychological evaluations; the results show that desirable speeds exist for both the pitch and yaw angles. In addition, human experiments were conducted in which an object in the remote environment was recognized using the proposed method and speeds. The results show that the trajectory of eye-gaze movement becomes smooth and the user can recognize the object easily, demonstrating the effectiveness of the proposed method.
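The abstract describes two processing steps: tracking the dark part of the eye with a particle filter to recognize the attention point, and adjusting the camera/robot orientation so that the attended point moves toward the image center. The following is a minimal sketch of those two ideas, not the authors' implementation: it assumes grayscale eye-camera frames as NumPy arrays, and all class and function names, gains (k_yaw, k_pitch), and particle-filter parameters are illustrative assumptions rather than values from the paper (which instead selects movement speeds from psychological evaluations).

```python
# Illustrative sketch only: a basic particle filter that tracks the dark region of
# a grayscale eye image (iris/pupil), and a proportional mapping from the recognized
# attention point to yaw/pitch commands that center the attended point in the image.
import numpy as np

class DarkRegionParticleFilter:
    """Track the darkest region (iris/pupil) of a grayscale eye image."""

    def __init__(self, frame_shape, n_particles=300, motion_std=5.0):
        h, w = frame_shape
        self.h, self.w, self.n = h, w, n_particles
        self.motion_std = motion_std
        # Particles are (x, y) positions, initialized uniformly over the image.
        self.particles = np.column_stack([
            np.random.uniform(0, w, self.n),
            np.random.uniform(0, h, self.n),
        ])
        self.weights = np.full(self.n, 1.0 / self.n)

    def step(self, frame):
        """One predict/update/resample cycle; returns the estimated pupil center."""
        # Predict: random-walk motion model, clipped to the image bounds.
        self.particles += np.random.normal(0.0, self.motion_std, self.particles.shape)
        self.particles[:, 0] = np.clip(self.particles[:, 0], 0, self.w - 1)
        self.particles[:, 1] = np.clip(self.particles[:, 1], 0, self.h - 1)

        # Update: darker pixels (lower intensity) receive higher weight.
        xs = self.particles[:, 0].astype(int)
        ys = self.particles[:, 1].astype(int)
        darkness = 255.0 - frame[ys, xs].astype(float)
        self.weights = darkness + 1e-6
        self.weights /= self.weights.sum()

        # Estimate: weighted mean of particle positions.
        estimate = np.average(self.particles, axis=0, weights=self.weights)

        # Resample: systematic resampling to avoid particle degeneracy.
        positions = (np.arange(self.n) + np.random.uniform()) / self.n
        indexes = np.searchsorted(np.cumsum(self.weights), positions)
        indexes = np.minimum(indexes, self.n - 1)
        self.particles = self.particles[indexes]
        self.weights = np.full(self.n, 1.0 / self.n)
        return estimate  # (x, y) in eye-image coordinates


def gaze_to_camera_command(gaze_xy, image_size, k_yaw=0.05, k_pitch=0.05):
    """Proportional yaw/pitch command that moves the attended point toward the image center.

    gaze_xy    : attention point in the robot-camera image, recognized from eye gaze.
    image_size : (width, height) of the displayed robot-camera image.
    k_yaw, k_pitch : illustrative gains (assumptions, not values from the paper).
    """
    cx, cy = image_size[0] / 2.0, image_size[1] / 2.0
    err_x = gaze_xy[0] - cx            # positive -> target is right of center
    err_y = gaze_xy[1] - cy            # positive -> target is below center
    yaw_cmd = k_yaw * err_x            # turn camera/robot toward the target
    pitch_cmd = -k_pitch * err_y       # tilt camera up when the target is above center
    return yaw_cmd, pitch_cmd


if __name__ == "__main__":
    # Toy demo on a synthetic eye frame containing a dark disc (the "pupil").
    frame = np.full((240, 320), 200, dtype=np.uint8)
    yy, xx = np.mgrid[0:240, 0:320]
    frame[(xx - 210) ** 2 + (yy - 100) ** 2 < 15 ** 2] = 20

    pf = DarkRegionParticleFilter(frame.shape)
    for _ in range(30):
        center = pf.step(frame)
    print("estimated pupil center:", center)
    print("yaw/pitch command:", gaze_to_camera_command(center, (320, 240)))
```

In the demo, the particle cloud concentrates on the dark disc over repeated resampling, and the resulting attention point is converted into a small yaw/pitch correction toward the image center; the paper's contribution is in how fast that correction should be executed, which this sketch does not model.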

© 2015, ライフサポート学会