Organizer: The Japan Society of Mechanical Engineers
Conference: ロボティクス・メカトロニクス講演会2019 (Robotics and Mechatronics Conference 2019)
Dates: 2019/06/05 - 2019/06/08
We utilize depth visual information to perform surface detection and classification for stable and energy-efficient locomotion of a point-footed bipedal robot in unstructured environments. The latest depth-vision systems provide highly accurate 3D point clouds along with RGB images. The proposed method segments and classifies the 3D images using machine-learning tools such as random forests (RF), support vector machines (SVM), and relevance vector machines (RVM) for pixel-level classification, together with object-based image analysis (OBIA) to achieve accurate object segmentation. In contrast to existing methods, surface-recognition-based locomotion control allows the robot to estimate optimal walking parameters, avoid unsuitable areas, and plan collision-free, energy-efficient walking. The main contribution of this work is the integration of accurate surface modeling with locomotion rules for bipedal walking control. Finally, we evaluate the average distance traveled and the average energy consumed along the bipedal robot's walking trajectory under different settings of the unstructured environment.
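As a rough illustration of the pixel-level classification stage described above, the sketch below trains a random forest on per-point geometric features derived from a depth point cloud. This is not the authors' implementation; the feature set (height, surface normal, local roughness), the three surface classes, and the synthetic data are assumptions introduced purely for illustration.

```python
# Minimal sketch (assumed setup, not the paper's pipeline): per-point surface
# classification with a random forest, where each depth pixel / 3D point is
# described by simple geometric features extracted from the point cloud.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical training data: one row per point.
# Columns: [z_height, normal_x, normal_y, normal_z, local_roughness]
features = rng.random((5000, 5))
# Hypothetical surface labels: 0 = flat/walkable, 1 = rough, 2 = obstacle
labels = rng.integers(0, 3, size=5000)

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.2, random_state=0)

# Random forest classifier, one of the pixel-level learners named in the text
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# The per-point predictions would then be grouped into surface regions
# (e.g., via object-based segmentation) before being passed to the walking planner.
print("held-out accuracy:", clf.score(X_test, y_test))
```

An SVM or RVM could be substituted for the forest in the same role; the surrounding segmentation and locomotion-planning steps are independent of which pixel-level classifier is used.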