This paper describes a new mechanism for a floor cleaning robot and its control system for autonomous navigation. The robot can be equipped with either of two cleaning functions: scrubbing or sweeping. The controller memorizes the positions of obstacles by repeating a longitudinal traverse and a 90 [deg] swivel motion. Using this information, the robot moves behind the obstacles and cleans the entire area.
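The traverse-and-swivel coverage idea can be sketched on a grid: the robot sweeps column by column, memorizes where obstacles interrupt a column, and then uses that memory to reach and clean the cells behind them. The grid size, obstacle layout, and two-pass strategy below are illustrative assumptions, not the paper's actual algorithm.

```python
# Illustrative sketch (assumed grid model, not the paper's algorithm):
# pass 1 is a boustrophedon sweep that stops a column at the first obstacle
# and memorizes it; pass 2 uses that memory to clean the cells behind.
W, H = 5, 4
obstacles = {(2, 1), (2, 2)}          # memorized obstacle cells (assumed)

cleaned = set()
# Pass 1: longitudinal traverses with 90-deg swivels at column ends.
for x in range(W):
    ys = range(H) if x % 2 == 0 else range(H - 1, -1, -1)
    for y in ys:
        if (x, y) in obstacles:
            break                     # obstacle met; its position is memorized
        cleaned.add((x, y))

# Pass 2: move behind the obstacles and clean the remaining free cells.
for x in range(W):
    for y in range(H):
        if (x, y) not in obstacles and (x, y) not in cleaned:
            cleaned.add((x, y))

free = {(x, y) for x in range(W) for y in range(H)} - obstacles
print(cleaned == free)   # -> True: every free cell is cleaned
```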
In this paper, we propose an implemented architecture that integrates knowledge of the environment into a reactive, behavior-based autonomous robot. As an alternative to hybrid methods that combine a behavior-based and a model-based system, we present a fully behavior-based system capable of both reactive and deliberative behavior. To acquire knowledge, the robot builds a "World Image", which is gradually constructed through experience on an associative memory, without symbolic representation, by a self-organizing process like that of higher animals. Since the "World Image" is designed to change over time and to be directly related to behavior, the robot can navigate adaptively and reactively to its destination, following a path plan based on the "World Image" of a dynamic environment. The path plan the robot makes is not a fixed sequence of routes as in traditional planning, but is flexible, as in reactive planning. If the robot detects a failure, it can still reach its destination without replanning or error recovery. The presented approach gives practicability to a behavior-based robot that already has adaptability in the real world.
Velocity ripples occur at each reference input time interval in the contour control of mechatronic servo systems. The steady-state velocity ripples were analyzed by introducing an appropriate analytical model of the mechatronic servo system, and the analytical result was evaluated in an experiment on an industrial DC servo motor. The experimental results provide important information for designing controllers for mechatronic servo systems.
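The origin of such ripples can be illustrated with a toy model: when a position reference is updated only at discrete intervals (zero-order hold) and tracked by a proportional position loop, the commanded velocity jumps at each reference update and decays in between. The first-order loop model, gains, and all numbers below are illustrative assumptions, not the paper's analytical model.

```python
# Toy sketch (assumed first-order model): velocity ripple induced by a
# staircase position reference updated every T_ref seconds.
import math

Kp = 30.0      # position-loop gain [1/s] (assumed)
T_ref = 0.01   # reference input time interval [s] (assumed)
dt = 1e-5      # simulation step [s]
v_cmd = 0.1    # commanded constant feed velocity [m/s] (assumed)

x = 0.0
vel_trace = []
for i in range(int(0.5 / dt)):
    t = i * dt
    # staircase reference: held constant between reference input instants
    r = v_cmd * (math.floor(t / T_ref) * T_ref)
    v = Kp * (r - x)          # proportional position loop -> velocity command
    x += v * dt
    if t > 0.4:               # record steady state only
        vel_trace.append(v)

mean_v = sum(vel_trace) / len(vel_trace)
ripple = max(vel_trace) - min(vel_trace)
print(f"mean velocity ~ {mean_v:.4f} m/s, peak-to-peak ripple ~ {ripple:.4f} m/s")
```

The mean velocity matches the command, but a nonzero peak-to-peak ripple at the reference interval remains, which is the phenomenon the paper analyzes.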
We present the design and implementation of a tactile sensor system, the sensor suit, that covers the entire body of a robot. The sensor suit is designed to be soft and flexible and to have a large number of sensing regions, using electrically conductive fabric and string. All signals from the sensor suit are superimposed on a visual image of the robot. We describe the construction of a sensor suit with 160 sensing elements for a full-body humanoid and report experimental results evaluating the electrically conductive fabric and the tactile sensing unit.
We propose a decentralized control algorithm for multiple robots handling a single object in coordination, in which each robot is controlled by its own controller. The motion command for the object is given to one of the robots (the leader), and the other robots estimate the leader's motion by themselves through the motion of the object and handle the object based on the estimated reference. The external force applied to the object, including the inertia force, is shared equally by all of the robots. The proposed control algorithm was experimentally applied to three mobile robots, each of which has one degree of freedom. The experimental results demonstrate the validity of the proposed control system.
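The equal load-sharing rule can be sketched as follows: the total force on the object, i.e. the inertia force needed for the (estimated) reference acceleration plus the external load, is divided equally among the N robots. The rigid-object model and all numbers are illustrative assumptions.

```python
# Sketch of equal force sharing among N coordinated robots (assumed numbers).
N = 3         # number of robots, as in the three-robot experiment
m_obj = 6.0   # object mass [kg] (assumed)
a_ref = 0.5   # object acceleration from the estimated reference [m/s^2] (assumed)
f_ext = 9.0   # external force applied to the object [N] (assumed)

# Total force = inertia force + external force on the object.
f_total = m_obj * a_ref + f_ext

# Each robot's own controller applies an equal share.
f_share = [f_total / N for _ in range(N)]
print(f_share)   # -> [4.0, 4.0, 4.0]
```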
A robust profile sensor based on pulsed-laser beam scanning and the triangulation principle is presented. Interference from external light, such as the welding arc and spatters, is eliminated by subtracting the CCD line sensor output captured during a turned-off period of the laser beam projection from the output captured during the adjacent turned-on period. Multiple reflections of the laser beam from workpiece surfaces are reduced by synchronously scanning the projected beam and the reflected light. These methods enable capturing a correct profile of a shiny workpiece and measuring the seam position near the welding torch. Robotic arc welding of a lap joint of 1.6 [mm]-thick steel (SPCC) plates has been successfully performed using the prototype sensor.
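The interference-rejection step amounts to a per-pixel subtraction of the laser-off capture (arc light and spatter only) from the adjacent laser-on capture, leaving only the laser reflection. The pixel values below are synthetic assumptions for illustration.

```python
# Sketch of arc-light rejection by on/off frame subtraction (synthetic pixels).
arc_light  = [5, 40, 80, 40, 5, 0, 0, 0]    # external light alone (laser off)
laser_spot = [0, 0, 0, 0, 0, 90, 30, 0]     # true laser reflection

frame_on  = [a + l for a, l in zip(arc_light, laser_spot)]  # turned-on period
frame_off = arc_light                                       # adjacent turned-off period

profile = [on - off for on, off in zip(frame_on, frame_off)]
print(profile)   # -> [0, 0, 0, 0, 0, 90, 30, 0]: interference removed
```

Because the two captures are adjacent in time, the arc-light contribution is nearly identical in both and cancels, while the laser profile survives.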
This paper describes a design methodology for a human-symbiotic robot manipulator that focuses mainly on guaranteeing human safety and on abilities for human-robot collaboration. We classify the safety design parameters into four categories: shock absorption materials, motion controls, collision conditions, and safety indices. First, we describe physical models of shock absorption covers, the realization of emergency stops and passive impedance control, experimental conditions of generalized human-robot collision, and severity indices for producing moderate injury. Next, human-robot collisions are simulated through combinations of these parameters, and the effect of each parameter on anticollision safety is summarized. Several remarks for deciding the design parameters are then derived from the experimental and simulation results. Finally, by synthesizing these remarks, we propose a safety design and control methodology for a human-symbiotic robot manipulator.
We address the problem of the impact dynamics of robot manipulators that make an instantaneous joint motion or displacement under some disturbance torque. Due to unknown joint effects, such as back-drivability or frictional displacement, and the dynamic coupling between the manipulator and its supporting base, the motion of the system is very difficult to predict after it makes impulsive contact with the environment. A method that uses the Extended Inversed Inertia Tensor and the Virtual Rotor Inertia is proposed to estimate the motion of such systems after impact. The method covers any class of rigid manipulator arm supported by a fixed base, a flexible deployable structure, or a free-floating satellite. Experiments using the MIT Vehicle Emulation System (VES II) were carried out to observe the impact behavior and to identify the values of the Virtual Rotor Inertia and the restitution coefficient. The results clearly show that the effective mass felt during impact varies with the manipulator configuration and joint conditions, but that the Virtual Rotor Inertia takes consistent values. Once those values are identified, the force impulse and post-impact velocity can be predicted with relatively high fidelity.
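Once an effective inertia at the contact point and a restitution coefficient are identified, the prediction step reduces to the impulse-momentum law for impact against a fixed environment. The following is a minimal sketch of that relation; the numbers are illustrative assumptions, not identified values from the VES II experiments.

```python
# Sketch of post-impact prediction from identified parameters (assumed numbers):
# impact of an effective inertia m_eff against a rigid, fixed environment.
m_eff = 12.0   # effective inertia seen at the contact point [kg] (assumed;
               # configuration- and joint-condition-dependent)
e = 0.4        # restitution coefficient (assumed identified value)
v_pre = 0.25   # approach velocity at contact [m/s] (assumed)

impulse = m_eff * (1.0 + e) * v_pre   # force impulse transferred [N*s]
v_post = -e * v_pre                   # rebound velocity after impact [m/s]
print(impulse, v_post)   # -> 4.2 -0.1
```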
In this paper, we consider dexterous manipulation through rolling motion. The problem is to steer an object held by robot fingers using rolling motion and to achieve a desired configuration within the hand. The kinematic constraint of rolling is nonholonomic, and designing a controller or planning a path for such a system is known to be difficult. We consider a simple model of this problem, a sphere held between two parallel plates, and propose a method of designing a feedback controller that achieves a desired orientation of the sphere. The first step is to apply a state and input transformation and convert the system into what is called 'the time-state control form'. Then, by approximately linearizing it, we design a feedback controller using ordinary linear control theory. One state appeared to be uncontrollable, but by adopting a new coordinate system, we could drive all the states to the desired point. Through numerical simulation, we show that a sphere can be manipulated to the desired configuration using this method.
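The time-state control form idea can be sketched on a generic chained toy system rather than the sphere model itself, which is more involved. For the system dx1/dt = u1, dx2/dt = u2, dx3/dt = x2*u1, treating x1 as a new "time" axis (u1 > 0 constant) makes the remaining dynamics linear, dx2/dx1 = u2/u1 and dx3/dx1 = x2, so ordinary linear state feedback stabilizes (x2, x3). The gains and initial state below are assumptions.

```python
# Toy sketch of time-state control on a chained system (not the paper's
# sphere-between-plates model). All gains and initial values are assumed.
dt = 1e-3
u1 = 1.0                      # constant drive along the time-like state x1
k2, k3 = 2.0, 1.0             # linear feedback gains (assumed; poles at -1, -1)
x1, x2, x3 = 0.0, 0.5, -0.3   # initial state (assumed)

for _ in range(20000):        # 20 s of simulation
    u2 = u1 * (-k2 * x2 - k3 * x3)   # feedback designed in the x1 "time" scale
    x1 += u1 * dt
    x2 += u2 * dt
    x3 += x2 * u1 * dt

print(x2, x3)   # both converge toward 0
```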
This paper presents the development of a three-fingered multi-sensory hand integrated into a precise space telerobotic system for in-orbit testing aboard the Engineering Test Satellite VII. First, we discuss the requirements for this hand. The hand must measure the positions of workpieces in order to approach them; we chose a hand-eye camera and three proximity range finders for this purpose. The hand must also grip, mate, and demate workpieces safely; it is therefore equipped with a wrist compliance device, a displacement sensor, a finger module, and two grip force sensors. Second, we present the mechanism and sensor devices. The finger module consists of one linear-movement finger and two rotary fingers with the grip force sensors. The wrist compliance device absorbs 4-DOF position errors, and the displacement sensor measures 3-DOF position errors while the hand performs tasks. Finally, we report how the mechanism and sensor devices are used, and discuss the hand design considering the conditions of launch and the space environment.