2013, Vol. 133, No. 2, pp. 356-364
In this research, position and rotation estimation for mobile robots outside a recorded path is realized by applying ego-motion estimation to view-based navigation. The ego-motion is calculated from differences in the 3D positions of SURF feature points between recorded and current images obtained by a Kinect sensor. In conventional view-based navigation, it is difficult to plan an alternative path when humans or objects are located on the recorded path. With our proposed estimation, flexible path planning becomes possible in real environments that include humans and objects. From experiments performed in actual indoor environments, we evaluated the measurement accuracy of the position and rotation estimated by our method, and confirmed its applicability to real environments including humans and objects.
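The abstract does not give the estimation formula, but a standard way to recover a rigid ego-motion (rotation and translation) from matched 3D feature-point pairs is SVD-based least-squares alignment (the Kabsch/Umeyama method). A minimal sketch under that assumption, taking matched SURF point pairs from the recorded and current Kinect frames as N×3 arrays (the function name and array layout are illustrative, not from the paper):

```python
import numpy as np

def estimate_ego_motion(p_rec, p_cur):
    """Estimate the rotation R and translation t that best map the
    recorded 3D points onto the current 3D points in a least-squares
    sense, via SVD of the cross-covariance matrix (Kabsch method).

    p_rec, p_cur: (N, 3) arrays of matched 3D feature positions.
    Returns (R, t) such that p_cur ≈ p_rec @ R.T + t.
    """
    # Centroids of each point set
    c_rec = p_rec.mean(axis=0)
    c_cur = p_cur.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (p_rec - c_rec).T @ (p_cur - c_cur)
    U, _, Vt = np.linalg.svd(H)
    # Correct for a possible reflection so R is a proper rotation
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_cur - R @ c_rec
    return R, t
```

In practice the SURF matches from depth images contain outliers, so a method like this would typically be wrapped in a robust scheme such as RANSAC before the final least-squares fit.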