Generally, wheel odometry accuracy tends to drop significantly in situations such as running on soft ground or when the robot is turning. Visual odometry (VO) is considered a promising alternative for estimating the trajectory of autonomous robots. However, conventional VO methods suffer from reduced accuracy in crowded environments or in environments with few feature points. In this paper, we propose a real-time and highly accurate VO method that uses a downward-facing monocular camera. The proposed method achieves both high accuracy and low computational cost by estimating and matching ground-surface features with a lightweight neural network. In addition, the camera pose is estimated automatically to prevent the loss of odometry accuracy caused by errors in the camera's mounting posture.
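To illustrate the general idea of downward-facing monocular VO, the following is a minimal sketch in Python with OpenCV. It is not the paper's pipeline: it substitutes classical ORB features for the lightweight neural network, and the focal length, camera height, and function names are assumptions introduced here for illustration. The key property it demonstrates is that, with the camera looking straight down at locally planar ground from a known height, frame-to-frame image motion can be recovered as a rigid 2D transform and converted to metric displacement.

```python
# Minimal sketch of downward-facing monocular VO (illustrative only).
# Assumes: camera pointing straight down, locally planar ground, and a
# known, fixed camera height. ORB stands in for the paper's learned
# ground-surface features; all constants below are assumed values.
import numpy as np
import cv2

FOCAL_PX = 700.0      # assumed focal length in pixels
CAM_HEIGHT_M = 0.30   # assumed camera height above the ground in meters
METERS_PER_PX = CAM_HEIGHT_M / FOCAL_PX  # ground sampling distance

orb = cv2.ORB_create(nfeatures=500)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def frame_to_frame_motion(prev_gray, curr_gray):
    """Estimate the apparent ground motion (dx, dy, dtheta) between two
    grayscale frames; the robot's egomotion is the inverse of this."""
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    if des1 is None or des2 is None:
        return None
    matches = matcher.match(des1, des2)
    if len(matches) < 8:
        return None
    src = np.float32([kp1[m.queryIdx].pt for m in matches])
    dst = np.float32([kp2[m.trainIdx].pt for m in matches])
    # Rigid 2D motion (rotation + translation) with RANSAC outlier rejection.
    M, inliers = cv2.estimateAffinePartial2D(src, dst)
    if M is None:
        return None
    dtheta = np.arctan2(M[1, 0], M[0, 0])
    dx = M[0, 2] * METERS_PER_PX  # pixel shift -> metric shift
    dy = M[1, 2] * METERS_PER_PX
    return dx, dy, dtheta
```

Accumulating these per-frame increments yields the robot trajectory. Note that the metric scale comes directly from the assumed camera height, which is why errors in the camera's mounting posture degrade odometry accuracy and why the paper estimates the camera pose automatically.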