Journal of Robotics and Mechatronics
Online ISSN : 1883-8049
Print ISSN : 0915-3942
ISSN-L : 0915-3942
Special Issue on Developments and Learning from the World Robot Challenge
PYNet: Poseclass and Yaw Angle Output Network for Object Pose Estimation
Kohei Fujita, Tsuyoshi Tasaki

2023, Vol. 35, No. 1, pp. 8-17

Abstract

The issue of estimating the poses of simple-shaped objects, such as retail store goods, has been addressed to ease the grasping of objects by robots. Conventional methods that estimate poses with an RGBD camera mounted on a robot have difficulty estimating the three-dimensional poses of simple-shaped objects with few shape features. Therefore, in this study, we propose a new class called “poseclass” to indicate the grounding face of an object. Because the poseclass is a discrete value, it can be solved as a classification problem and estimated with high accuracy; in addition, the three-dimensional pose estimation problem can be simplified into a one-dimensional pose estimation problem of estimating the yaw angle on the grounding face. We have developed a new neural network (PYNet) that estimates the poseclass and yaw angle, and compared it with conventional methods in terms of the ratio of unknown simple-shaped object poses estimated with an angle error of 30° or less. The ratio achieved by PYNet (68.9%) is 18.1 points higher than that of the conventional methods (50.8%). Additionally, a robot implementing PYNet successfully grasped convenience store goods.
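To illustrate the idea of splitting pose estimation into a discrete grounding-face classification and a one-dimensional yaw estimate, the following is a minimal sketch in PyTorch. It is not the authors' implementation: the backbone, feature size, number of poseclasses, and the (cos, sin) yaw parameterization are all assumptions made for illustration.

# Minimal two-head sketch (hypothetical, not the paper's architecture):
# one head classifies the poseclass (which face rests on the ground),
# the other outputs the yaw angle about that grounding face.
import torch
import torch.nn as nn

class PYNetSketch(nn.Module):
    def __init__(self, feature_dim: int = 512, num_poseclasses: int = 6):
        super().__init__()
        # Hypothetical feature extractor for a 4-channel RGBD crop.
        self.backbone = nn.Sequential(
            nn.Conv2d(4, 32, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(32, feature_dim),
            nn.ReLU(),
        )
        # Poseclass head: discrete grounding-face label, solved as classification.
        self.poseclass_head = nn.Linear(feature_dim, num_poseclasses)
        # Yaw head: predicts (cos, sin) of the yaw angle to avoid the
        # 0/360-degree discontinuity (an assumption, not from the paper).
        self.yaw_head = nn.Linear(feature_dim, 2)

    def forward(self, rgbd: torch.Tensor):
        f = self.backbone(rgbd)
        poseclass_logits = self.poseclass_head(f)
        cos_sin = torch.tanh(self.yaw_head(f))
        yaw = torch.atan2(cos_sin[:, 1], cos_sin[:, 0])  # yaw in radians
        return poseclass_logits, yaw

Given a batch of RGBD crops of shape (B, 4, H, W), the forward pass returns grounding-face logits (trainable with cross-entropy) and a per-image yaw estimate, reflecting how the discrete poseclass reduces the remaining pose estimation to a single rotation angle.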


© 2023 Fuji Technology Press Ltd.

This article is licensed under a Creative Commons Attribution-NoDerivatives 4.0 International license (https://creativecommons.org/licenses/by-nd/4.0/).
The journal is fully Open Access under Creative Commons licenses, and all articles are freely accessible on the official JRM website.
https://www.fujipress.jp/jrobomech/rb-about/