Abstract
This paper deals with interactive learning of robot partner behaviors through multi-modal communication based on a touch interface, accelerometers, and other sensors. A robot partner is controlled by multi-objective behavior coordination using collision avoidance, target tracing, wall following, and formation with other robots. A specific behavior can be trained through gesture navigation using the touch interface. We propose an interactive behavior learning method based on human gesture navigation. The robot extracts a trajectory pattern using a spiking neural network, and performs unsupervised clustering with a self-organizing map. Finally, we present several experimental results of the proposed method and discuss its effectiveness.