2017, Vol. 22, No. 1, pp. 71-80
In this research, we propose a wearable suit for embodiment transformation that virtually realizes a child's experience while preserving the user's interactions and perceptions. The embodiment transformation suit consists of a viewpoint translator and passive hand exoskeletons. The viewpoint translator simulates a child's point of view (POV) using a pan-tilt stereo camera mounted at the user's waist and a head mounted display (HMD); the pan-tilt mechanism follows the user's head motion. The passive hand exoskeletons simulate a child's small grasping motion using multiple quadric crank mechanisms and a child-sized rubber hand. Experiencing a virtualized child's embodiment through one's own body provides opportunities to feel and understand a child's perception and cognition, and to evaluate products and spaces such as hospitals, public facilities, and homes from the perspective of universal design. This paper describes the system design and implementation of the viewpoint translator and the exoskeletons, and their assessment based on users' feedback at exhibitions.
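The core control idea of the viewpoint translator, mapping the user's HMD head orientation onto the waist-mounted pan-tilt camera, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, angle conventions, and the servo range limits are assumptions, and a real system would also handle tracker latency and smoothing.

```python
def head_to_pan_tilt(yaw_deg: float, pitch_deg: float,
                     pan_range=(-90.0, 90.0), tilt_range=(-45.0, 45.0)):
    """Map HMD yaw/pitch (degrees) to pan-tilt servo targets.

    The HMD's head-tracking angles are forwarded directly to the
    pan-tilt unit, clamped to the mechanism's (assumed) travel limits
    so the camera never drives past its mechanical stops.
    """
    def clamp(value, lo, hi):
        return max(lo, min(hi, value))

    pan = clamp(yaw_deg, *pan_range)     # horizontal head turn -> camera pan
    tilt = clamp(pitch_deg, *tilt_range) # vertical head nod    -> camera tilt
    return pan, tilt
```

In use, each new HMD pose sample would be converted and sent to the servo controller, e.g. `head_to_pan_tilt(30.0, -10.0)` yields `(30.0, -10.0)`, while an out-of-range request such as `head_to_pan_tilt(120.0, 80.0)` is clamped to `(90.0, 45.0)`.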