Abstract
There has been much research on the estimation and characterization of human emotions through communication channels such as facial expressions. However, most of this research has focused on extracting facial features for specific emotions in specific situations, because of the difficulty of general characterization. We have developed a system that characterizes the emotion of an e-Learning user by analyzing his/her facial expression and biometric signals. The criteria used to classify the eight emotions were based on a time-sequential subjective evaluation of emotions as well as a time-sequential analysis of facial expressions and biometric signals. For ten e-Learning examinees, the average coincidence ratio between the emotions discriminated by these emotion-diagnosis criteria and the time-sequential subjectively evaluated emotions was 71%. When only facial expressions were used, the coincidence ratio was 66%. This suggests that multi-modal emotion diagnosis is effective for estimating an e-Learning user's emotions.