Abstract
Autonomous robots that replace human workers are in increasing demand in the manufacturing sector worldwide. However, most such robots are driven by fixed sequence control and therefore cannot operate in random or unknown environments. To solve this problem, two cameras can be used as vision sensors that recognize the surrounding environment. When these vision sensors are fitted to a material handling system, the cameras must accurately recognize the shape and position of an object from the image information. In this research, we aim to improve camera calibration accuracy so that the shape and position of an object can be calculated accurately from the images obtained by the cameras. This accuracy comprises both 2D and 3D calibration accuracies, which are known to be affected by the calibration method, the environmental conditions, and the positions of the two cameras. We therefore propose an experimental method for optimizing the positions and angles of the two cameras. First, we examine how the cameras' positions and angles influence the calibration accuracies. We then determine the optimal ranges of the influential factors and select the optimal camera position and angle.