Humans recognize emotions from nonverbal cues such as voice quality and variation in utterances, not from the words themselves. Such nonverbal elements are called "paralanguage". In a previous study, we investigated utterance recognition by focusing on lip movements. However, the utterance recognition rate decreased because of individual variation. We therefore investigated utterance training based on lip movements to improve both the recognition rate and individual utterance. On the other hand, the eyes serve as a predominant source of information in daily life, conveying, for instance, whether a person appears energetic or glazed over. In the course of this work, we observed that the lips opened differently depending on daily physical condition and fatigue. From this observation, we hypothesized that physical condition can be evaluated psychophysically from image information such as eye and lip movements. The final goal of this research is early detection of disease and appropriate treatment in medical and nursing care workplaces using our lip-movement and eye-movement analyzer. In this study, subjects were given a simple calculation task in order to acquire involuntary eye movements. Subjects also uttered Japanese sentences intended to improve their utterance. The CFF value was used to check visual fatigue. We discuss these results in this paper.