Nonverbal skills are fundamental in determining the subjective impression of public speaking. Eye contact, in particular, is a demanding skill that requires presenters to switch their gaze among multiple audience members while speaking, and training is essential to acquire it. However, there is little research on systems or methods that allow presenters to learn eye-contact skills. In addition, conventional training takes considerable time before the acquired skills can be used in an actual environment. In this study, we propose an AR-based method that supports eye-contact skills in real time during actual presentations and also provides training effects through continuous use. Based on an analysis of the behaviors of outstanding presenters, we defined appropriate eye contact and developed the proposed method to elicit these behaviors by incorporating unique training strategies. Experiments were conducted with 15 participants over five days in a simulated presentation environment to confirm the effectiveness of the proposed method. As a result, both the supporting and training effects were partially confirmed by analyzing eye-gaze data and a metric of presentation skills.