Abstract
Body gestures and postures are important channels of non-verbal communication, and emotions play a central role in interactions among humans and with artificial agents and robots. Many emotion models are based on facial expressions; however, models based on body posture are less established, particularly for real-time estimation. Pressure sensors on a chair and accelerometers on the body can predict emotional factors, but because such sensors must be attached to the body, a method that requires no body-worn devices is needed. We introduce Kinect, a remote sensing device, to measure body posture. We find that standing postures have two semantic factors: “arousal” and “avoidance”. We construct and evaluate a real-time emotion estimation system that measures the positions of body parts (the head, hands, arms, shoulders, trunk, legs, and feet) and models the relationship between these positions and emotions with a regression equation for each emotion. This paper contributes not only to building a real-time emotion judgment system applicable to robots but also to analyzing the emotions conveyed by others' standing postures.