The Proceedings of JSME annual Conference on Robotics and Mechatronics (Robomec)
Online ISSN : 2424-3124
2008
Session ID : 2P2-C13
2P2-C13 3D Environment Modeling from an Omni-directional Image Sequence
Ryousuke KAWANISHI, Atsushi YAMASHITA, Toru KANEKO

Abstract
A 3D environment model is important for the tasks of autonomous mobile robots. In this paper, we propose a method for 3D environment modeling based on structure from motion using an omni-directional camera installed on a mobile robot. The method extracts and tracks feature points in an omni-directional image sequence to obtain corresponding points in image pairs taken during the robot's movement. The relative camera positions and orientations are estimated from the positions of the corresponding points. With these relations and the image coordinates of the feature points, the 3D coordinates of the points are calculated by triangulation. The individual measurements are integrated by scale matching. Triangular meshes are then constructed by 3D Delaunay triangulation, and the method makes the meshes consistent with the physical shape by using texture information from the images. A 3D environment model is generated by texture mapping. Experimental results showed the effectiveness of the proposed method.
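The triangulation step described in the abstract can be sketched as follows: given the estimated relative camera poses (as projection matrices) and a pair of corresponding image points, the 3D point is recovered by linear (DLT) triangulation. This is a minimal illustrative example with NumPy, not the authors' implementation; the function name and camera setup are assumptions, and for simplicity it uses a perspective camera model rather than the omni-directional model of the paper.

```python
import numpy as np

def triangulate_point(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.

    P1, P2 : 3x4 projection matrices (assumed known, e.g. from the
             estimated relative camera positions and orientations).
    x1, x2 : corresponding image points (u, v) in each view.
    Returns the 3D point in inhomogeneous coordinates.
    """
    # Each correspondence contributes two rows of the homogeneous
    # linear system A X = 0.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The least-squares solution is the right singular vector
    # associated with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Usage: two hypothetical cameras separated by a unit baseline
# along the x-axis (stand-ins for two robot positions).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])               # first camera at origin
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])  # second camera, baseline 1
X_true = np.array([0.5, 0.2, 4.0, 1.0])                     # a known 3D point
x1 = (P1 @ X_true)[:2] / (P1 @ X_true)[2]                   # its projection in view 1
x2 = (P2 @ X_true)[:2] / (P2 @ X_true)[2]                   # its projection in view 2
X_est = triangulate_point(P1, P2, x1, x2)
```

Repeating this over all tracked feature points yields the per-segment point clouds that the method then integrates by scale matching.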
© 2008 The Japan Society of Mechanical Engineers