IEEJ Transactions on Electronics, Information and Systems
Online ISSN : 1348-8155
Print ISSN : 0385-4221
ISSN-L : 0385-4221
Special Issue Paper
Informing a Robot of Object Location with Both Hand-Gesture and Verbal Cues
Tomoyasu Mizuno, Yoshinori Takeuchi, Hiroaki Kudo, Tetsuya Matsumoto, Noboru Ohnishi, Tsuyoshi Yamamura

2003 Volume 123 Issue 12 Pages 2142-2151


Recently, many kinds of robots have been developed, and many of them work in human living spaces. One of the most important interactions between a robot and a human occurs when the human informs the robot of an object's location. The purpose of this work is to build an interface for informing a robot of an object's location in a human living space containing several objects. We assume that the robot has already located the user by sound source localization. First, the robot recognizes the user's pointing gesture and verbal cues and detects candidate object locations. The system recognizes the pointing direction with a stereo camera and recognizes the verbal cues; the pointing direction and the directive word are used to restrict the search space. When multiple candidate objects are detected, the system asks the user for additional features, such as a color name or the relative location among the candidates, and then identifies one of them. We conducted experiments on a dialog task with three objects in the search space. The system is able to specify the object through dialog, after which the robot moves toward it.
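The two-stage process described above, restricting the search space with the pointing direction and then disambiguating with an additional verbal feature, can be sketched roughly as follows. The object list, cone half-angle, and helper names are illustrative assumptions, not the paper's actual implementation.

```python
import math

# Hypothetical object records: name, color, and 2-D position (metres)
# in the robot's coordinate frame. The layout is purely illustrative.
objects = [
    {"name": "cup",  "color": "red",  "pos": (1.0, 0.2)},
    {"name": "book", "color": "blue", "pos": (1.2, 0.3)},
    {"name": "ball", "color": "red",  "pos": (0.3, 1.5)},
]

def within_cone(pos, origin, direction_deg, half_angle_deg=15.0):
    """True if `pos` lies inside a cone around the pointing direction."""
    dx, dy = pos[0] - origin[0], pos[1] - origin[1]
    bearing = math.degrees(math.atan2(dy, dx))
    # Smallest signed angular difference, folded into [-180, 180].
    diff = abs((bearing - direction_deg + 180) % 360 - 180)
    return diff <= half_angle_deg

def restrict(objs, origin, direction_deg):
    """Keep only objects consistent with the recognized pointing gesture."""
    return [o for o in objs if within_cone(o["pos"], origin, direction_deg)]

def disambiguate(candidates, color=None):
    """Narrow the candidates with an additional verbal feature (a color name)."""
    if color is None:
        return candidates
    return [o for o in candidates if o["color"] == color]

# The user points roughly along the +x axis from the origin.
candidates = restrict(objects, (0.0, 0.0), 0.0)
# Two candidates remain, so the system asks for a color; "red" resolves it.
target = disambiguate(candidates, color="red")
```

Here the directive word would set the cone's origin region (e.g. near the user or near the robot); only the color-based follow-up question is shown.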

© 2003 by the Institute of Electrical Engineers of Japan