The Proceedings of JSME annual Conference on Robotics and Mechatronics (Robomec)
Online ISSN : 2424-3124
2021
Session ID : 1P1-H01

Extraction of Hand Motions Using First-Person View Video for Teaching Motions to Robots
*Daichi AKATSUKA, Sung-Gwi CHO, Jun TAKAMATSU, Tsukasa OGASAWARA
Abstract

Service robots are expected to be used in many situations, such as homes and factories, in place of people. However, teaching robots the behaviors that humans routinely perform takes considerable effort. To expand the use of service robots, it is necessary to develop a support system for teaching robots, one that easily transfers behaviors from humans to robots. In this paper, we propose a teaching system that extracts hand-object interactions from first-person view videos acquired by a camera attached to a human's head. The proposed system generates a sequence of hand motions as teaching information for a robot by extracting three types of simple motion elements (Translate, Rotate, and Grasp) from the first-person view video. In the experiment, we confirm the usefulness of this system by having a person perform a pouring task as an example, extracting the motion element sequence from the first-person view video, and reconstructing the robot's pouring motion from it.
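The abstract describes decomposing a demonstration into a sequence of three primitive motion elements (Translate, Rotate, Grasp). The paper itself is restricted access, so the following is only a minimal illustrative sketch of what such a sequence representation might look like; the class names, fields, and the toy pouring sequence are assumptions, not the authors' implementation.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical container for one of the three motion elements named in the
# abstract. The parameter layout (displacement vector, angle, grip state)
# is illustrative only.
@dataclass
class MotionElement:
    kind: str                 # "Translate", "Rotate", or "Grasp"
    params: Tuple[float, ...] # e.g. (dx, dy, dz), (angle_deg,), or (grip,)

def describe(sequence: List[MotionElement]) -> List[str]:
    """Render a motion-element sequence as numbered, human-readable steps."""
    return [f"{i + 1}. {e.kind}{e.params}" for i, e in enumerate(sequence)]

# A toy sequence loosely resembling the pouring task used in the experiment:
# grasp the container, lift it, tilt to pour, tilt back.
pouring = [
    MotionElement("Grasp", (1.0,)),               # close the gripper
    MotionElement("Translate", (0.0, 0.0, 0.1)),  # lift 10 cm
    MotionElement("Rotate", (90.0,)),             # tilt to pour
    MotionElement("Rotate", (-90.0,)),            # tilt back upright
]

for step in describe(pouring):
    print(step)
```

Such a flat list of typed primitives is one plausible way a motion-element sequence could be handed to a robot controller for replay.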

© 2021 The Japan Society of Mechanical Engineers