Journal of Robotics and Mechatronics
Online ISSN : 1883-8049
Print ISSN : 0915-3942
ISSN-L : 0915-3942
Special Issue on Industrial Robotics and Systems
Gesture Interface with Pointing Direction Classification by Deep Learning Based on RGB Image of Fovea Camera
Takahiro Ikeda, Tsubasa Imamura, Satoshi Ueki, Hironao Yamada

2025 Volume 37 Issue 2 Pages 466-477

Abstract

This paper describes a gesture interface for operating autonomous mobile robots (AMRs) used for transportation in industrial factories. The proposed interface recognizes the pointing directions of human operators (workers in the factory) by deep learning on images captured by a fovea-lens camera, classifying pointing gestures into seven directions with a recognition accuracy of 0.89. The paper also introduces a navigation method that allows the AMR to act on the proposed interface: the AMR approaches the pointed target by adjusting its horizontal angle based on object recognition in RGB images. With the proposed gesture interface and navigation method combined, the AMR achieved high positioning accuracy, with a mean position error of 0.052 m.
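The horizontal-angle adjustment described above can be sketched as a simple visual-servoing step: the detected target's horizontal pixel offset in the RGB image is converted to a bearing angle, and a proportional heading command steers the AMR toward it. This is a minimal illustration, not the authors' implementation; the image width, field of view, gain, and rate limit below are hypothetical placeholder values.

```python
import math

# Hypothetical camera parameters (assumptions, not taken from the paper).
IMAGE_WIDTH_PX = 640
HORIZONTAL_FOV_DEG = 90.0  # fovea-lens optics differ; placeholder value


def pixel_to_angle_deg(target_x_px: float,
                       image_width: int = IMAGE_WIDTH_PX,
                       fov_deg: float = HORIZONTAL_FOV_DEG) -> float:
    """Convert a detected target's horizontal pixel position to a bearing
    angle (degrees) relative to the camera's optical axis, using a simple
    pinhole-camera approximation."""
    cx = image_width / 2.0
    focal_px = cx / math.tan(math.radians(fov_deg / 2.0))
    return math.degrees(math.atan2(target_x_px - cx, focal_px))


def heading_command_deg(target_x_px: float,
                        gain: float = 0.5,
                        max_rate_deg: float = 15.0) -> float:
    """Proportional heading correction toward the pointed target,
    clamped to a maximum turn rate."""
    error_deg = pixel_to_angle_deg(target_x_px)
    return max(-max_rate_deg, min(max_rate_deg, gain * error_deg))
```

For example, a target detected at the image center produces a zero heading command, while a target at the right image edge saturates the command at the rate limit.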


© 2025 Fuji Technology Press Ltd.

This article is licensed under a Creative Commons [Attribution-NoDerivatives 4.0 International] license (https://creativecommons.org/licenses/by-nd/4.0/).
The journal is fully Open Access under Creative Commons licenses and all articles are free to access at JRM official website.
https://www.fujipress.jp/jrobomech/rb-about/