Abstract
Visually impaired persons face several burdens when applying makeup because the process involves steps that require visual information. Moreover, many visually impaired persons feel anxious about makeup and tend to avoid it even after they have mastered makeup techniques. We introduce an interface that confirms the result of makeup in place of a human assistant. In particular, we focus on alerting the user to unintended makeup in the lip area, which strongly affects the overall impression of the face. We first design a novel interaction between a visually impaired person and a mobile platform. Next, we propose a processing method and a system configuration for realizing the proposed interaction. Finally, we discuss the feasibility of our method for daily makeup confirmation through experiments. In this paper, we propose an essential framework to support makeup based on interaction using vibration and audio feedback.