Transactions of the Japan Society of Mechanical Engineers Series C
Online ISSN : 1884-8354
Print ISSN : 0387-5024
A Study on Robot-Human System with Consideration of Individual Preferences
2nd Report, Multimodal Human-Machine Interface for Object-Handing Robot System
Mitsuru JINDAI, Satoru SHIBATA, Tomonori YAMAMOTO, Tomio WATANABE

2007 Volume 73 Issue 729 Pages 1408-1415

Abstract

In this study, we propose an object-handing robot system with a multimodal human-machine interface composed of speech recognition and image processing units. Using this multimodal interface, the cooperator can instruct the object-handing robot system with voice commands and hand gestures. The motion parameters of the robot, namely the maximum velocity, the velocity profile peak, and the handing position, can be adjusted by voice commands or hand gestures so that the robot realizes the most appropriate handing motion. Furthermore, the cooperator can order the handing of objects using voice commands together with hand gestures. These voice commands may include adverbs, which allows the cooperator to make efficient adjustments, because the adjustment value of each motion parameter is determined by the adverb. In particular, the adjustment values corresponding to adverbs are estimated by fuzzy inference in order to take the ambiguity of human speech into consideration.
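
The abstract states only that adjustment values corresponding to adverbs are estimated by fuzzy inference. As a rough illustration of that idea, and not the authors' implementation, the following Python sketch maps recognized adverbs to a crisp adjustment value using triangular membership functions, max aggregation, and centre-of-gravity defuzzification; the adverb vocabulary, the membership ranges, and the adjustment_from_adverbs helper are all illustrative assumptions.

    # Minimal sketch (assumed, not from the paper): fuzzy inference from
    # adverbs in a voice command to a motion-parameter adjustment value.
    import numpy as np

    # Universe of discourse for the adjustment value, here the fraction by
    # which a parameter such as maximum velocity is increased (illustrative).
    u = np.linspace(0.0, 0.5, 501)

    def triangular(x, a, b, c):
        """Triangular membership function with peak at b and feet at a, c."""
        return np.clip(np.minimum((x - a) / (b - a + 1e-12),
                                  (c - x) / (c - b + 1e-12)), 0.0, 1.0)

    # Hypothetical fuzzy sets for adverbs a cooperator might use.
    ADVERB_SETS = {
        "slightly": triangular(u, 0.00, 0.05, 0.15),
        "a little": triangular(u, 0.05, 0.15, 0.25),
        "much":     triangular(u, 0.20, 0.35, 0.50),
    }

    def adjustment_from_adverbs(adverbs, weights=None):
        """Aggregate the fuzzy sets of the recognized adverbs and defuzzify
        by centre of gravity to obtain a crisp adjustment value."""
        weights = weights or {a: 1.0 for a in adverbs}
        aggregate = np.zeros_like(u)
        for adverb in adverbs:
            mu = ADVERB_SETS.get(adverb)
            if mu is not None:
                # Scale each set by a recognition confidence and combine
                # the scaled sets with the maximum operator.
                aggregate = np.maximum(aggregate, weights.get(adverb, 1.0) * mu)
        if aggregate.sum() == 0.0:
            return 0.0  # no known adverb recognized: leave the parameter as-is
        return float((u * aggregate).sum() / aggregate.sum())

    if __name__ == "__main__":
        # "Hand it a little faster" -> raise the maximum velocity by ~15 %.
        delta = adjustment_from_adverbs(["a little"])
        v_max = 0.8  # current maximum velocity [m/s], illustrative
        print(f"adjustment: {delta:.2f}, new v_max: {v_max * (1 + delta):.2f} m/s")

The same mechanism could be applied to the other motion parameters mentioned in the abstract (velocity profile peak and handing position) by defining a separate universe of discourse and adverb sets for each.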

© The Japan Society of Mechanical Engineers