The Proceedings of JSME annual Conference on Robotics and Mechatronics (Robomec)
Online ISSN : 2424-3124
2007
Session ID : 1A2-N07
1A2-N07 A robot that can be directed in a simplified spoken language and by tactile interaction
Tetsushi OKA, Takashi SASAKI
Abstract
This paper demonstrates a prototype test bed for usability studies of a multi-modal language for directing robots, built on a Sony AIBO and JULIAN, a grammar-based speech recognition engine. We developed a Japanese-based grammar for spoken commands to simplify command understanding, together with an event-driven system that interprets multi-modal directions. In pilot studies, we confirmed that the robot reacts properly to spoken, non-verbal, and multi-modal directions from humans. Results from recent experiments with subjects unfamiliar with robots are also reported and discussed.
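The event-driven interpretation of multi-modal directions described in the abstract can be illustrated with a minimal sketch. The Python below is a hypothetical illustration, not the authors' implementation: the Event structure, the event names ("speech", "touch"), the fusion time window, and the action table are all assumptions made for the example.

import time
from dataclasses import dataclass, field

# Hypothetical sketch of an event-driven multi-modal interpreter.
# Event names, the fusion window, and the action table are assumptions,
# not the implementation reported in the paper.

@dataclass
class Event:
    modality: str   # "speech" (from a grammar-based recognizer) or "touch"
    value: str      # e.g. a parsed command symbol or a touched sensor name
    timestamp: float = field(default_factory=time.time)

class MultiModalInterpreter:
    FUSION_WINDOW = 2.0  # seconds within which speech and touch are fused

    def __init__(self):
        self.pending = []
        # Action table: (speech symbol, touch sensor or None) -> robot action
        self.actions = {
            ("come", None): "walk_toward_speaker",
            ("sit", None): "sit_down",
            ("look", "head_sensor"): "turn_head_to_touch",
        }

    def dispatch(self, event):
        """Buffer events and fire an action when a direction is complete."""
        now = event.timestamp
        # Drop buffered events that have fallen outside the fusion window.
        self.pending = [e for e in self.pending
                        if now - e.timestamp <= self.FUSION_WINDOW]
        self.pending.append(event)
        speech = next((e for e in self.pending if e.modality == "speech"), None)
        touch = next((e for e in self.pending if e.modality == "touch"), None)
        if speech:
            key = (speech.value, touch.value if touch else None)
            action = self.actions.get(key)
            if action:
                self.pending.clear()
                return action
        return None

interpreter = MultiModalInterpreter()
interpreter.dispatch(Event("touch", "head_sensor"))
print(interpreter.dispatch(Event("speech", "look")))  # -> turn_head_to_touch

The fusion window is the key design choice in such a scheme: a tactile cue arriving shortly before or after a spoken command can disambiguate it, while a spoken command with no accompanying touch falls back to a speech-only interpretation.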
© 2007 The Japan Society of Mechanical Engineers