This study develops a self-reconfigurable modular robot, “CHOBIE II,” which changes its configuration to adapt to the load condition. Self-reconfiguration of modular robots is performed through cooperative actions of the constituent modules. Thus, the key issue is how to express the behavioral rules that govern the actions of each module. In this paper, we propose a new method that expresses these rules with numerical parameters. By applying this method, the rules of CHOBIE II can be expressed with 32 parameters. This paper demonstrates three kinds of self-reconfiguration realized by adjusting the parameters. This paper also discusses practical operations of CHOBIE II enabled by the new method.
Robotic systems intended for domestic housekeeping or elderly care tasks will interact with humans and therefore demand an intrinsically safe design. Conventional robotic systems are driven by heavy, high-power, high-stiffness electric motors, making them unsafe for human coexistence. To overcome this problem, we have developed a robotic arm system using pneumatic muscle actuators, which is intrinsically safe owing to its low weight and pliant structure. In this paper we present the design of the robot arm and its associated control system, showing that the robot has positioning and force control accuracy sufficient for daily domestic tasks.
We have developed a transfer system that consists of autonomous mobile robots and a multi-robot controller that controls them. One of the features of this system is improved transportation efficiency, achieved through hybrid control that combines the individual intelligence of each autonomous mobile robot with the global intelligence of the multi-robot controller. By introducing this system, the space required for belt conveyors can be reduced, people's traffic lines can be secured, and the risk that the entire transfer system stops due to a single sensor failure can be reduced. Moreover, the system can accommodate layout changes because no track construction is required, and it can operate in areas where people are present. This system was introduced at BML, INC. as a “blood sample transportation robot system,” and these features were proven effective through continuous 24-hour operation. In this paper, the practical use of the developed system is described.
In view of safety issues, haptic sensing is indispensable for human support robots. Furthermore, haptic communication based on haptic sensing technology has high potential as a part of multi-modal communication. This paper therefore proposes a method of command recognition based on haptic interfaces. When an operator runs a finger across the haptic interface, the command is recognized from feature quantities derived from the contact trajectory. The command recognition method achieves a reliable emergency stop function by utilizing force information. Experimental results show that the efficiency of robot operation is enhanced because multi-dimensional information can be transmitted by a single intuitive motion.
This paper proposes a model of approach behavior with which a robot can initiate conversation with people who are walking. We developed the model by learning from the failures of a simplistic approach behavior used in a real shopping mall. Sometimes people were unaware of the robot's presence, even when it spoke to them. Other times, people were not sure whether the robot was really trying to start a conversation, and they did not start talking with it even though they displayed interest. To prevent such failures, our model includes the following functions: predicting the walking behavior of people, choosing a target person, planning its approach path, and nonverbally indicating its intention to initiate a conversation. The approach model was implemented and used in a real shopping mall. The field trial demonstrated that our model significantly improves the robot's performance in initiating conversations.
In this paper, we present our current research on developing a model of robot behavior that creates feelings of “being together” using the robot's body position and orientation. Creating feelings of “being together” will be an essential skill for robots that live with humans and adapt to daily human activities such as walking together or establishing joint attention to information in the environment. We observed people's proxemic behavior in joint attention situations and developed a behavior model that enables a robot to detect a partner's attention shift and appropriately adjust its body position and orientation when establishing joint attention with the partner. We experimentally evaluated the model, and the results demonstrate its effectiveness.
The International Space Station and other construction projects in low Earth orbit are increasing the demand for EVA (extravehicular activity), and robots that assist EVA are therefore highly anticipated. In this paper, we clarify the development and design requirements of an EVA support robot and its dexterous hand. A dexterous hand used in EVA must be small enough to handle EVA tools, capable of high grasping force, and modularized so that the hand can be exchanged depending on the mission. Following these requirements, we designed a test model of the dexterous hand. In this paper, a new index finger for our dexterous hand is introduced. The finger consists of one actuator and one passive joint. The architecture and features of this finger are described in the latter half of this paper.
To improve face-to-face interaction with robots, we developed a model for generating interactive facial expressions using a simple recurrent network (SRN). Conventional models of robot facial expression use predefined expressions, so only a limited number of expressions can be presented. This means that the expression may not match the interaction and that the person may find the expressions monotonous. Both problems can be overcome by generating expressions dynamically. We tested this model by incorporating it into a robot and comparing the expressions it generated with those of a conventional model. The results demonstrate that using our model increases the diversity of face-to-face interaction with robots.