Abstract
Robust self-localization is important for autonomous underwater vehicles (AUVs), especially on missions covering long distances, both for following given waypoints accurately and for the subsequent processing of the obtained data. To improve the accuracy of the position estimate, we propose a SLAM-based localization method that uses images acquired by a single camera. It compares a topological map, built from features detected in photos taken at different positions, against a terrain map acquired by a different method, such as light sectioning. We also propose a filtering method that improves feature-based matching between image pairs, in which the known bathymetry map is used to eliminate erroneous feature matches. At the same time, the self-localization is estimated so that the motion parallax is consistent with the terrain measurements. The performance of the proposed method was verified through field and tank experiments.