Transactions of the JSME (in Japanese)
Online ISSN : 2187-9761
ISSN-L : 2187-9761


Real-time self-attitude estimation using visual images and/or structures of the environments
Ryota OZAKI, Yoji KURODA
Advance online publication

Article ID: 21-00098

Abstract

This paper presents a real-time self-attitude estimation method that exploits clues to the direction of gravity hidden in camera images and in the structure of the surrounding environment. In the proposed method, angular velocity from a gyroscope is integrated, a camera-based method estimates the gravity direction, and a LiDAR-based method estimates the gravity direction as well; these estimates are fused with an extended Kalman filter (EKF). The camera-based gravity direction estimation uses a deep neural network (DNN) that learns the regularity between the gravity direction and landscape information; having learned this regularity, the DNN can infer the gravity direction from a single image. The DNN outputs both a mean and a variance to express the uncertainty of its inference. The LiDAR-based gravity direction estimation extracts vertical planes from the surrounding environment measured by the LiDAR and derives the gravity direction from their normals. Using both the camera and the LiDAR yields more robust and accurate estimation. Static validations on test datasets show that the DNN can estimate the direction of gravity with an expression of uncertainty, and dynamic validations show that the proposed EKF-based method can estimate the attitude in real time. These validations are performed both in simulation and in the real world to compare the proposed method with conventional methods.
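The geometric idea behind the LiDAR-based step can be sketched briefly: every vertical plane has a normal perpendicular to gravity, so given several such normals the gravity direction is the unit vector least aligned with all of them, i.e. the right singular vector of the stacked normal matrix with the smallest singular value. The sketch below is an illustration under that assumption, not the authors' implementation; the function name `gravity_from_vertical_normals` and the downward sign convention are hypothetical.

```python
import numpy as np

def gravity_from_vertical_normals(normals):
    """Estimate the gravity direction from normals of detected vertical
    planes. Each vertical-plane normal n_i is perpendicular to gravity,
    so the estimate is the unit vector g minimizing sum_i (n_i . g)^2:
    the right singular vector with the smallest singular value."""
    N = np.asarray(normals, dtype=float)
    N /= np.linalg.norm(N, axis=1, keepdims=True)  # normalize each normal
    _, _, Vt = np.linalg.svd(N)
    g = Vt[-1]            # direction least aligned with all normals
    if g[2] > 0:          # sign convention: gravity points along -z
        g = -g
    return g

# Two walls at a right angle, both vertical: gravity must lie along -z.
g = gravity_from_vertical_normals([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
```

With noisy normals the SVD gives the least-squares solution, so the estimate degrades gracefully rather than failing when the planes are not perfectly vertical.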

© 2021 The Japan Society of Mechanical Engineers