Journal of Robotics and Mechatronics
Online ISSN : 1883-8049
Print ISSN : 0915-3942
ISSN-L : 0915-3942
Special Issue on Field Robotics with Vision Systems
A Novel Method for Goal Recognition from 10 m Distance Using Deep Learning in CanSat
Miho Akiyama, Takuya Saito

2021 Volume 33 Issue 6 Pages 1359-1372

Abstract

In this study, we propose a method that enables a CanSat to recognize its goal and guide itself toward it using deep learning image classification from as far as 10 m away, and we describe the results of a demonstrative evaluation confirming the method's effectiveness. We first applied deep learning image classification to goal recognition in a CanSat at ARLISS 2019, where the CanSat was guided almost all the way to the goal in all three runs and won first place as overall winner. However, that conventional method has a drawback: the goal recognition rate drops significantly when the CanSat is more than 6–7 m from the goal, making it difficult to guide the CanSat back when various factors cause it to move away. To enable goal recognition from a distance of 10 m, we investigated the number of horizontal region-of-interest (ROI) divisions and the method of vertical shifting during image recognition, and clarified the effective number of divisions and the resulting recognition rate through experiments. Although object detection is commonly used to locate an object in an image with deep learning, we confirmed that the proposed method achieves a higher recognition rate at long distances and a shorter computation time than SSD MobileNet V1. In addition, we entered the CanSat contest ACTS 2020 to evaluate the proposed method, achieved a zero-distance goal in all three runs, and demonstrated its effectiveness by winning first place in the comeback category.
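
The abstract does not give implementation details; the following is a minimal sketch, assuming a TensorFlow/Keras binary goal classifier, of how a camera frame might be split into horizontal ROI divisions and classified per region to estimate the goal direction. The model file name, input size, number of divisions, and probability threshold are illustrative assumptions, not the authors' values, and the vertical-shift search described in the paper is omitted here.

```python
# Sketch (not the authors' code): per-ROI goal classification for steering.
# The frame width is divided into N horizontal ROI divisions, each ROI is
# classified as goal / not-goal, and the ROI with the highest goal probability
# indicates the goal direction.
import numpy as np
import tensorflow as tf

N_DIVISIONS = 5          # assumed number of horizontal ROI divisions
INPUT_SIZE = (64, 64)    # assumed classifier input resolution

# Hypothetical trained classifier; the paper's network and weights are not given.
model = tf.keras.models.load_model("goal_classifier.h5")

def split_horizontal_rois(frame, n):
    """Split the frame width into n equal regions (horizontal ROI divisions)."""
    h, w, _ = frame.shape
    step = w // n
    return [frame[:, i * step:(i + 1) * step] for i in range(n)]

def goal_direction(frame):
    """Return the index of the ROI most likely to contain the goal, or None."""
    rois = split_horizontal_rois(frame, N_DIVISIONS)
    batch = np.stack([
        tf.image.resize(roi, INPUT_SIZE).numpy() / 255.0 for roi in rois
    ])
    probs = model.predict(batch, verbose=0)[:, 0]  # assumed: column 0 = goal probability
    best = int(np.argmax(probs))
    return best if probs[best] > 0.5 else None    # assumed decision threshold
```

A rover-side controller could then map the returned ROI index to a steering command (for example, leftmost ROIs turn left, rightmost turn right, center drives straight); that mapping is likewise an assumption for illustration.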

© 2021 Fuji Technology Press Ltd.

This article is licensed under a Creative Commons Attribution-NoDerivatives 4.0 International license (https://creativecommons.org/licenses/by-nd/4.0/).
The journal is fully Open Access under Creative Commons licenses, and all articles are free to access at the JRM official website.
https://www.fujipress.jp/jrobomech/rb-about/