The Proceedings of JSME annual Conference on Robotics and Mechatronics (Robomec)
Online ISSN : 2424-3124
2015
Session ID : 2A2-L10
2A2-L10 Study of localization based on visual experience
Tomoya MURASE, Kentaro YANAGIHARA, Kanji TANAKA
Abstract
In this study, we address the problem of visual robot localization using a monocular vision sensor. The goal of visual robot localization is to localize the robot with respect to a view sequence map by incorporating measurements from the robot's vision sensors. To this end, the robot updates its current belief of self-position every time a new sensor measurement arrives. There are two popular approaches to this problem: the Kalman filter approach and the particle filter approach. The former provides efficient estimation (e.g., with a Kalman filter) when the initial robot pose is known, while the latter provides robust estimation (e.g., with a particle filter) when the initial robot pose is unknown. Since each approach has its own advantages and drawbacks, we propose a unified approach, called the GMM (Gaussian Mixture Model) filter, that combines the advantages of both. Our basic idea is to approximate the inherently multi-modal belief distribution efficiently and accurately by a mixture of Gaussian distributions. The effectiveness of the proposed method is experimentally verified using a real-world robot vision system.
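The abstract gives no implementation details, so the following is only a minimal sketch of how a GMM-style localization filter could look, assuming a 2D position state, an odometry-like motion input, and a position-like measurement derived from the view sequence map; the names (GMMFilter, GaussianComponent, predict, update) are hypothetical and not taken from the paper.

```python
# Minimal sketch (hypothetical, not the authors' implementation): the belief
# over robot pose is a weighted mixture of Gaussians. Each component is
# propagated by a simple motion model and corrected by a Kalman update
# against an assumed position measurement, then reweighted by its likelihood.
import numpy as np

class GaussianComponent:
    def __init__(self, mean, cov, weight):
        self.mean = np.asarray(mean, dtype=float)   # pose hypothesis (x, y)
        self.cov = np.asarray(cov, dtype=float)     # pose covariance
        self.weight = float(weight)                 # mixture weight

class GMMFilter:
    def __init__(self, components):
        self.components = components

    def predict(self, u, Q):
        # Motion update: shift each component by odometry u, inflate by noise Q.
        for c in self.components:
            c.mean = c.mean + u
            c.cov = c.cov + Q

    def update(self, z, R):
        # Measurement update: Kalman-correct each component with observation z,
        # then reweight components by their measurement likelihood.
        for c in self.components:
            S = c.cov + R                        # innovation covariance
            S_inv = np.linalg.inv(S)
            K = c.cov @ S_inv                    # Kalman gain
            innov = z - c.mean
            c.mean = c.mean + K @ innov
            c.cov = (np.eye(len(z)) - K) @ c.cov
            # Gaussian likelihood of the innovation under N(0, S)
            norm = 1.0 / np.sqrt((2 * np.pi) ** len(z) * np.linalg.det(S))
            c.weight *= norm * np.exp(-0.5 * innov @ S_inv @ innov)
        total = sum(c.weight for c in self.components)
        for c in self.components:
            c.weight /= total                    # renormalize mixture weights

# Usage: two pose hypotheses (a multi-modal belief), one motion step, one measurement.
f = GMMFilter([GaussianComponent([0, 0], np.eye(2), 0.5),
               GaussianComponent([5, 0], np.eye(2), 0.5)])
f.predict(u=np.array([1.0, 0.0]), Q=0.1 * np.eye(2))
f.update(z=np.array([1.2, 0.1]), R=0.5 * np.eye(2))
```

In this sketch each Gaussian component acts like a single Kalman-filter hypothesis, while the mixture weights play the role that sample weights play in a particle filter, which is the combination of the two approaches the abstract describes.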
© 2015 The Japan Society of Mechanical Engineers