The Proceedings of JSME annual Conference on Robotics and Mechatronics (Robomec)
Online ISSN: 2424-3124
2023
Session ID: 1A2-E06

Prediction of finger motions based on high-density electromyographic signals using two-dimensional convolutional neural networks
*He Chongzaijiao, D.S.V. Bandara, Hirofumi Nogami, Jumpei Arata
Abstract

Generating multiple grasping patterns with different finger arrangements is an important characteristic of human hands in activities of daily living. Wearable robotic hands, such as exoskeletons and prostheses, are used to supplement or substitute human hands in rehabilitation, motion assistance, and amputation, respectively. To control them intuitively, it is important to understand the motion intention of the wearer. This has been a challenge for hand motions because of the large number of individual degrees of freedom (DOF) of the hand and the limited amount of motion-related information that can be obtained non-invasively. Recently, however, high-density surface electromyography (HDEMG) has proven effective in providing sufficient spatial resolution to access motion-related information for different hand motions. Thus, in this study, toward controlling a hand prosthesis, we estimated the motion intention for six different finger motions using time-series HDEMG data. HDEMG signals were recorded with a 64-channel electrode grid, and a two-dimensional convolutional neural network was trained on the root mean square values of the preprocessed HDEMG data to estimate the six selected finger motions. The results demonstrated that the proposed methodology can estimate motion intention with an average accuracy of 85% and a highest accuracy of 92%.

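The abstract describes a pipeline of windowed RMS feature extraction from a 64-channel HDEMG grid followed by 2D CNN classification of six finger motions. The sketch below is not the authors' implementation; it only illustrates that kind of pipeline under assumptions stated in the comments (an 8x8 electrode arrangement, a 200-sample analysis window, and a small PyTorch network layout chosen for illustration).

```python
# Minimal sketch of an RMS-map + 2D CNN pipeline (not the authors' code).
# Assumptions: the 64 channels form an 8x8 grid, features are per-channel RMS
# over non-overlapping windows, and the CNN architecture is illustrative only.

import numpy as np
import torch
import torch.nn as nn

N_CHANNELS = 64          # 64-channel electrode grid (from the abstract)
GRID_SHAPE = (8, 8)      # assumed spatial arrangement of the grid
WINDOW = 200             # assumed analysis window length in samples
N_CLASSES = 6            # six finger motions (from the abstract)


def rms_maps(emg, window=WINDOW):
    """Compute per-channel RMS over non-overlapping windows and reshape
    each 64-value feature vector into a 2D electrode map.

    emg: array of shape (n_samples, 64)
    returns: array of shape (n_windows, 1, 8, 8)
    """
    n_windows = emg.shape[0] // window
    emg = emg[: n_windows * window].reshape(n_windows, window, N_CHANNELS)
    rms = np.sqrt(np.mean(emg ** 2, axis=1))            # (n_windows, 64)
    return rms.reshape(n_windows, 1, *GRID_SHAPE).astype(np.float32)


class FingerMotionCNN(nn.Module):
    """Small 2D CNN over the 8x8 RMS map (illustrative architecture)."""

    def __init__(self, n_classes=N_CLASSES):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 8x8 -> 4x4
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 4 * 4, 64),
            nn.ReLU(),
            nn.Linear(64, n_classes),                    # logits for 6 motions
        )

    def forward(self, x):
        return self.classifier(self.features(x))


if __name__ == "__main__":
    # Synthetic stand-in for preprocessed HDEMG: 10 s at an assumed 2 kHz.
    emg = np.random.randn(20000, N_CHANNELS)
    x = torch.from_numpy(rms_maps(emg))                  # (100, 1, 8, 8)
    model = FingerMotionCNN()
    logits = model(x)                                    # (100, 6)
    print(logits.shape)
```

In such a setup the model would be trained with a standard cross-entropy loss on labeled windows for the six motions; the abstract does not specify the window length, grid layout, or network depth, so those choices above are placeholders.
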
© 2023 The Japan Society of Mechanical Engineers