Simulation is an important issue in robotics research, since it is essential for evaluating and predicting the behavior of a robot. However, previously developed simulation systems could not simulate the visual perception process of a robot, which means that vision-based behaviors could not be simulated. In this paper, we first propose a novel approach that uses VR technology to simulate the vision-based behavior of a robot. We then describe the implemented system, called the “View Simulation System” (VSS), which was designed to simulate the behavior of our view-based navigation method. The VSS consists of a CG generator and vision-processing hardware, and can run the same program as the real mobile robot. Finally, the feasibility of the VSS is shown through experiments using the real robot.
Recently, view-based or appearance-based approaches have been attracting interest in computer vision research. Based on a similar idea, we have proposed a view-based navigation method using a model of the route called the “View Sequence.” It contains a sequence of frontal views along a route, memorized during a recording run, and recognition of the environment is realized by matching the current view against the memorized view sequence. In this paper, we discuss the characteristics required of the views in a view sequence and evaluate our previous method of generating views. We then confirm through an experiment that stereo disparity satisfies the requirements of the view sequence, and apply a disparity view sequence to outdoor navigation. The experimental results indicate that views other than normal camera images can be utilized in our view-based navigation method.
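The core of such view-sequence localization is comparing the current view against each memorized view. A minimal sketch, assuming normalized cross-correlation as the image-matching criterion (the abstract does not specify the exact matching measure):

```python
import numpy as np

def match_view(current_view, view_sequence):
    """Locate the current view within a memorized view sequence.

    Sketch of view-sequence matching (assumed criterion): each memorized
    view is compared with the current image by normalized cross-correlation,
    and the index of the best match localizes the robot along the route.
    """
    def ncc(a, b):
        # Zero-mean normalized cross-correlation of two equal-size images.
        a = a - a.mean()
        b = b - b.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum())
        return (a * b).sum() / denom if denom > 0 else 0.0

    scores = [ncc(current_view, v) for v in view_sequence]
    best = int(np.argmax(scores))
    return best, scores[best]
```

The same skeleton works whether the views are intensity images or, as the abstract suggests, stereo-disparity images.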
In this paper we present an efficient algorithm for manipulating the zero moment point of walking robots, and its application to controlling their angular momentum. A remarkable feature of our control method is that the zero moment point is treated as an actuating signal for the controller. The proposed method is applicable in real-time situations because it does not require accurate joint-angle tracking. Its application to walking robots results in smooth and soft motion. Experimental results, together with a theoretical explanation, verify the validity of the proposed method.
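For context, the zero moment point that serves as the actuating signal can be related to the robot's center-of-mass motion. The following is a textbook cart-table (linear inverted pendulum) sketch, not the paper's specific controller:

```python
def zmp_cart_table(x_com, z_com, x_com_ddot, g=9.81):
    """Zero moment point of the cart-table model (one horizontal axis).

    Textbook relation, assumed for illustration: for a point mass at
    constant height z_com with horizontal acceleration x_com_ddot, the
    ZMP is displaced from the CoM ground projection by (z_com / g) times
    the horizontal acceleration, with opposite sign.
    """
    return x_com - (z_com / g) * x_com_ddot
```

Under this model, commanding a ZMP displacement is equivalent to commanding a CoM acceleration, which is why the ZMP can act as a control input.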
The performance indices of model-based motion control are experimentally investigated for a 6-D.O.F. industrial manipulator. Adopting standardization compensators in both the joint space and the task space, trajectory control in each direction is examined to evaluate the accuracy and decoupling indices of the model-based controllers. The frequency response in each direction is also investigated by spectrum analysis of the step response.
To dexterously perform object grasping and manipulation with a robotic multifingered hand, sensing of the finger joint torque is required. In general, the finger joint is so small that torque sensing is hard to realize. This paper proposes a novel mechanism, called the “Strain-Deformation Expansion Mechanism,” that senses the joint torque and is small enough to fit in the finger joint. With this torque-sensing mechanism, the small strain deformation of the joint used for torque sensing can be expanded without reducing the joint stiffness. In this paper, the torque-sensing principle is addressed by theoretically analyzing the deformation of the sensing mechanism and the torques acting on the joint axis. Then, the sensitivity of the sensing mechanism and its expansion rate of sensitivity are defined, and a method for realizing the sensing mechanism with high sensitivity is discussed. Lastly, experiments with a robot finger are performed to show the basic characteristics and the effectiveness of the proposed torque-sensing mechanism.
Flexibility of the mechanical system is one of the critical factors limiting the bandwidth of positioning control. In this paper, we consider a manipulator mounted on a base that contains flexibility. Based on the positive realness of the transfer matrix, we define the “robust arm configuration,” a kind of singular configuration in which the system is passive. We show that the mechanical system obtains good robustness in the neighborhood of the robust arm configuration. The validity is confirmed by application to a flexible assembly system: the positions of the base and the pallets are optimized, and high-bandwidth positioning is realized.
In this paper, we propose a whole-body haptic interface for human symbiotic robots. First, the tactile and force information needed to accurately detect physical interference with humans over the entire surface of a human symbiotic robot is specified. Next, a design method for whole-body surface cover sensors that enables robots to detect accurate force vectors applied to various surfaces of the body is described. The basic structure of the cover sensor utilizes a force-torque sensor surrounded by several touch sensors. The cover structure was implemented on the humanoid robot WENDY. Finally, experiments were carried out to verify the effectiveness of the proposed method as a human-robot haptic interface. The results show high capabilities of locus tracking on the surface and of arm-motion processing based on the detected tactile and force information, and indicate that the proposed design method for realizing a whole-body haptic interface is capable of enhancing human-robot symbiosis.
This paper focuses on methods for acquiring the contact force distributions acting on high-resolution tactile sensors during object handling. Prototype tactile sensors mounted on the fingers are realized with a light-conductive plate in close contact with a sensing surface made of soft sensing materials. As an object grasped by the parallel fingers comes into contact with the tactile sensors, the contact patterns are captured as tactual images by a CCD camera placed underneath the light-conductive plate. The tactual images can be acquired dynamically for the grasped object at pseudo-video rate. When a square pillar is grasped and external forces are applied to the end of the object, tactual images are taken through the tactile sensors at intervals of 200 ms and the force vectors acting on the fingers are derived from them. The orientation of a stick-like object on the tactile sensors is discriminated from the moment of the displacement distribution. Finally, the Gaussian and mean curvatures of fundamental local shapes of objects are estimated from the tactual images.
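The curvature estimation at the end of this abstract can be illustrated with the standard Monge-patch formulas for a height field. This is a generic sketch under the assumption that the tactual image is treated as a surface height map z(x, y); it is not the paper's exact procedure:

```python
import numpy as np

def surface_curvatures(z, spacing=1.0):
    """Gaussian (K) and mean (H) curvature of a height field z[y, x].

    Monge-patch sketch (assumed, not the paper's method): partial
    derivatives are estimated with finite differences and substituted
    into the standard formulas
        K = (z_xx * z_yy - z_xy**2) / (1 + z_x**2 + z_y**2)**2
        H = ((1 + z_x**2) * z_yy - 2 * z_x * z_y * z_xy
             + (1 + z_y**2) * z_xx) / (2 * (1 + z_x**2 + z_y**2)**1.5)
    """
    zx = np.gradient(z, spacing, axis=1)   # dz/dx
    zy = np.gradient(z, spacing, axis=0)   # dz/dy
    zxx = np.gradient(zx, spacing, axis=1)
    zxy = np.gradient(zx, spacing, axis=0)
    zyy = np.gradient(zy, spacing, axis=0)
    g = 1.0 + zx**2 + zy**2
    K = (zxx * zyy - zxy**2) / g**2
    H = ((1.0 + zx**2) * zyy - 2.0 * zx * zy * zxy
         + (1.0 + zy**2) * zxx) / (2.0 * g**1.5)
    return K, H
```

The signs of K and H then classify the fundamental local shapes (elliptic, hyperbolic, parabolic, planar) that the abstract refers to.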