Abstract
This paper presents an intuitive interface that allows a care receiver with limited mobility to teach a life-support robot in a cluttered living environment. Using a time-of-flight (TOF) laser sensor mounted on a pan-tilt actuator and controlled with a PC mouse, the care receiver can "click" a physical object for the robot to manipulate. Through this "real-world click", the 3D coordinates of the object are obtained from the pan-tilt actuator's angles and the measured laser beam length. This technology enables a seamless "drag & drop" operation between the PC world and the physical world and automatically generates an instruction for the robot. The care receiver can look around the entire room through images from a camera mounted near the laser sensor. We implemented an experimental "real-world click" interface and evaluated its usability and capability in terms of throughput and measurement accuracy. Finally, we applied the proposed method to a mobile robot with a multi-degree-of-freedom manipulator and confirmed its validity in conveying tasks.
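The core measurement step described above, recovering a clicked object's 3D position from the pan angle, tilt angle, and TOF range, amounts to a spherical-to-Cartesian conversion. The following is a minimal sketch of that conversion, assuming the laser origin coincides with the pan-tilt rotation center and that angles are given in radians; the function name and frame convention (z up, tilt measured from the horizontal plane) are illustrative assumptions, not the paper's implementation.

```python
import math

def click_to_xyz(pan_rad: float, tilt_rad: float, range_m: float):
    """Convert a pan angle, tilt angle, and TOF range reading into
    Cartesian coordinates in the sensor frame (x forward, z up).

    Assumes the laser beam origin is at the pan-tilt rotation center;
    a real system would also apply the sensor-to-robot transform.
    """
    horizontal = range_m * math.cos(tilt_rad)  # projection onto the ground plane
    x = horizontal * math.cos(pan_rad)
    y = horizontal * math.sin(pan_rad)
    z = range_m * math.sin(tilt_rad)
    return x, y, z

# A click straight ahead at a 2 m range lies on the x axis:
# click_to_xyz(0.0, 0.0, 2.0) -> (2.0, 0.0, 0.0)
```

In practice the returned point would be transformed into the robot's base frame before being embedded in the generated manipulation instruction.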