Abstract
As robots become more ubiquitous in daily life, humans and robots work in ever-closer physical proximity. This closeness changes the nature of human-robot interaction considerably. First, safety becomes a greater concern, since a robot may accidentally strike a nearby human. Second, touch from humans can serve as a useful additional communication channel, and a particularly natural one for humans to use. Covering the whole robot body with malleable tactile sensors can help address the safety issues while providing a new communication interface. In this paper, we discuss attempts to solve some of the difficult new technical and information-processing challenges presented by flexible touch-sensitive skin. Our approach is based on the locality of haptic features for classifying touch interactions. We hypothesize that useful haptic features are composed of local sensor output pairs. We found that, using sparse sensor pairs containing as few as 15% of the full set of sensor combinations, it is possible to classify interaction scenarios with up to 80% accuracy in a 15-way forced-choice task. Visualizations of the learned subspaces show that, for many touch categories, the learned sensor pairs consist mainly of physically local sensor groups, as we hypothesized.