Host: The Japan Society of Mechanical Engineers
Name: [in Japanese]
Date: May 10, 2017 - May 13, 2017
This article reports the development of a visual odometry system that estimates, in an online manner, the 3D trajectory of a rescue dog moving through a disaster scene, using a camera mounted on the dog's back. The major difficulty is that camera shake occurs constantly while the dog moves (i.e., runs or even just walks), causing significant changes between consecutive video frames. This makes it hard to employ standard visual SLAM/odometry systems. To cope with this, we use an omnidirectional camera and match feature points globally between consecutive video frames to make the matches as accurate as possible. To achieve real-time computation, we implement the system using multiple threads on the CPU and GPU. We conducted experiments in the experimental field of the ImPACT TRC and confirmed the effectiveness of our approach.