Proceedings of the Annual Conference of JSAI
Online ISSN : 2758-7347
36th (2022)
Session ID : 3F3-GS-9-01

Development of Non-Contact User Interface by Hand Gesture Recognition Using Deep Learning
*Daiki ISHIGURO, Tomoko OZEKI
Abstract

In this study, we develop a non-contact user interface that recognizes hand gestures from images captured by a monocular RGB camera and uses them to operate a web application. Although prior UIs have used infrared or motion sensors to manipulate objects without touching the device, we aim to realize a gesture-operated system that runs easily on general-purpose mobile devices by using only a monocular RGB camera. First, we collect the coordinates of each hand joint detected by MediaPipe, a machine learning library, as training data, and classify them into several gestures by deep learning. The system then manipulates a map displayed in the application according to the recognized gesture. We also trained the system with different networks, such as MLP, CNN, and LSTM, to compare their accuracy and select the most suitable one. The LSTM achieved 94% gesture recognition accuracy, and we built a NUI system that runs on mobile devices.
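The pipeline described above (per-frame hand landmarks fed to a recurrent classifier) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the sequence length, hidden size, number of gesture classes, and the randomly initialised weights are all assumptions, and a random array stands in for real MediaPipe output (MediaPipe Hands does return 21 landmarks with x, y, z coordinates per detected hand).

```python
import numpy as np

# Illustrative dimensions: MediaPipe Hands yields 21 landmarks, each (x, y, z).
N_LANDMARKS, COORDS = 21, 3
FEAT = N_LANDMARKS * COORDS      # 63 features per frame
SEQ_LEN = 16                     # frames per gesture sample (assumed)
HIDDEN = 32                      # LSTM hidden size (assumed)
N_GESTURES = 4                   # e.g. pan/zoom gestures for a map (assumed)

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Randomly initialised LSTM-cell weights, stand-ins for trained parameters.
Wx = rng.standard_normal((FEAT, 4 * HIDDEN)) * 0.1
Wh = rng.standard_normal((HIDDEN, 4 * HIDDEN)) * 0.1
b = np.zeros(4 * HIDDEN)
Wout = rng.standard_normal((HIDDEN, N_GESTURES)) * 0.1

def classify_gesture(seq):
    """Run one landmark sequence of shape (SEQ_LEN, FEAT) through a
    single LSTM cell and return the predicted gesture index."""
    h = np.zeros(HIDDEN)
    c = np.zeros(HIDDEN)
    for frame in seq:
        z = frame @ Wx + h @ Wh + b
        i, f, g, o = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)   # update cell state
        h = o * np.tanh(c)           # update hidden state
    logits = h @ Wout                # classify from the final hidden state
    return int(np.argmax(logits))

# A fake landmark sequence standing in for MediaPipe output.
sample = rng.standard_normal((SEQ_LEN, FEAT))
pred = classify_gesture(sample)
print(pred)  # an index in [0, N_GESTURES)
```

In a real system the per-frame feature vector would come from MediaPipe's hand-landmark detector, and the recognized gesture index would be mapped to a map operation (e.g. pan or zoom) in the web application.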

© 2022 The Japanese Society for Artificial Intelligence