The Proceedings of JSME annual Conference on Robotics and Mechatronics (Robomec)
Online ISSN : 2424-3124
2020
Session ID : 2P2-J01

Human Action Classification Using the Motion Detection Model of the Insect Visual System and the Echo State Network
*Akito Morita, Hirotsugu Okuno
Abstract

Spatio-temporal information is an important cue for action classification, but handling it in neural networks is computationally expensive. In this study, we developed an action classification algorithm that processes spatio-temporal information efficiently by combining the elementary motion detector (EMD), a low-computational-cost neuro-inspired motion sensing algorithm, with the echo state network (ESN), a three-layered neural network with a recurrent layer. We evaluated the algorithm by classifying six kinds of human actions and compared its accuracy with that of the same algorithm without EMD preprocessing to clarify the effect of the preprocessing. The results showed that the algorithm successfully classified the human actions and that EMD preprocessing drastically increased the accuracy.
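The pipeline described in the abstract can be sketched as follows: an EMD stage (here a Hassenstein-Reichardt correlator) converts raw video frames into direction-selective motion signals, and an echo state network with a fixed random reservoir and a trained linear readout classifies the resulting sequences. This is a minimal illustrative sketch, not the authors' implementation; all parameter values (delay, reservoir size, spectral radius, leak rate, ridge coefficient) and the flattening of EMD output into per-frame feature vectors are assumptions.

import numpy as np

def emd_reichardt(frames, delay=1):
    """Hassenstein-Reichardt elementary motion detector (horizontal).

    Correlates each pixel's delayed luminance with its right-hand
    neighbour's current luminance and subtracts the mirrored term,
    yielding a direction-selective motion signal per frame.
    frames: array of shape (T, H, W); delay is an illustrative assumption.
    """
    f = frames.astype(float)
    cur, old = f[delay:], f[:-delay]
    # left-to-right correlation minus right-to-left correlation
    return old[:, :, :-1] * cur[:, :, 1:] - cur[:, :, :-1] * old[:, :, 1:]

class EchoStateNetwork:
    """Three-layered network with a fixed random recurrent reservoir;
    only the linear readout is trained (ridge regression)."""

    def __init__(self, n_in, n_res=300, spectral_radius=0.9, leak=0.3, seed=0):
        rng = np.random.default_rng(seed)
        self.w_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
        w = rng.uniform(-0.5, 0.5, (n_res, n_res))
        # rescale recurrent weights to the chosen spectral radius
        self.w = w * (spectral_radius / np.max(np.abs(np.linalg.eigvals(w))))
        self.leak = leak

    def _states(self, u_seq):
        # run the leaky-integrator reservoir over one clip of feature vectors
        x = np.zeros(self.w.shape[0])
        for u in u_seq:  # u: flattened EMD output for one frame
            x = (1 - self.leak) * x + self.leak * np.tanh(self.w_in @ u + self.w @ x)
        return x  # final state summarizes the clip

    def fit(self, sequences, labels, ridge=1e-3):
        X = np.stack([self._states(s) for s in sequences])
        Y = np.eye(labels.max() + 1)[labels]  # one-hot class targets
        # ridge-regression readout: the only trained weights in an ESN
        self.w_out = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ Y)

    def predict(self, sequences):
        X = np.stack([self._states(s) for s in sequences])
        return np.argmax(X @ self.w_out, axis=1)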

© 2020 The Japan Society of Mechanical Engineers