Host: The Japan Society of Mechanical Engineers
Name: [in Japanese]
Date: June 01, 2022 - June 04, 2022
Hazard detection, simultaneous localization and mapping (SLAM), and terrain classification are crucial for autonomous navigation on the Moon and other planetary surfaces. In this paper, we integrate a 360-degree laser sensor (LIDAR) and an omni-directional camera in a co-axial arrangement to achieve these three goals. The point-cloud data from the LIDAR are used to detect hazardous objects, build a high-precision map of the surrounding environment, and localize the rover within it. The visual images from the omni-directional camera add semantic labels to the detected objects through the Mask Region-based Convolutional Neural Network (Mask R-CNN) technique. The performance of the integrated system was evaluated in a simulated planetary field.
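
The abstract does not give implementation details, but the co-axial arrangement implies that LIDAR points map onto the panoramic image purely through their spherical angles, which makes the LIDAR/camera fusion step simple. The following Python sketch illustrates that idea under stated assumptions: an equirectangular camera model, torchvision's off-the-shelf Mask R-CNN, and hypothetical function names; it is not the authors' implementation.

import numpy as np
import torch
from torchvision.models.detection import maskrcnn_resnet50_fpn

def project_to_panorama(points, width, height):
    # Map Nx3 LIDAR points (sensor frame) to pixel coordinates, assuming an
    # equirectangular omni-directional image sharing the LIDAR's axis.
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    azimuth = np.arctan2(y, x)                 # [-pi, pi]
    elevation = np.arctan2(z, np.hypot(x, y))  # [-pi/2, pi/2]
    u = (0.5 - azimuth / (2 * np.pi)) * width  # column index
    v = (0.5 - elevation / np.pi) * height     # row index
    return np.stack([u, v], axis=1).astype(int)

def label_points(points, image, score_thresh=0.7, mask_thresh=0.5):
    # Attach a Mask R-CNN class id to each LIDAR point (-1 = unlabeled).
    # A generic COCO-pretrained model stands in for the paper's network.
    model = maskrcnn_resnet50_fpn(weights="DEFAULT").eval()
    tensor = torch.from_numpy(image).permute(2, 0, 1).float() / 255.0
    with torch.no_grad():
        out = model([tensor])[0]
    h, w = image.shape[:2]
    uv = project_to_panorama(points, w, h)
    rows = uv[:, 1].clip(0, h - 1)
    cols = uv[:, 0].clip(0, w - 1)
    labels = np.full(len(points), -1, dtype=int)
    for mask, label, score in zip(out["masks"], out["labels"], out["scores"]):
        if score < score_thresh:
            continue
        hit = mask[0].numpy()[rows, cols] > mask_thresh  # points inside mask
        labels[hit] = int(label)
    return labels

In this scheme, hazardous objects segmented from the point cloud inherit semantic classes wherever their projected points fall inside a high-confidence instance mask; the exact segmentation and thresholds used in the paper are not specified in the abstract.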