Abstract
We propose a new method for identifying persons from the dynamics of their facial expressions. In the proposed method, facial feature points are first extracted from the initial frame of an image sequence using an Active Appearance Model, and are then tracked over subsequent frames with a Lucas-Kanade-based method. Next, the interval from the onset to the offset of the facial expression change is extracted, and a feature vector is computed over this interval. In the identification phase, an input vector is classified by computing its distance to each reference vector using DP matching. We demonstrate the effectiveness of the proposed method on smile image sequences from the MMI Facial Expression Database.
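The abstract does not give the exact DP-matching formulation used in the identification phase; the following is a minimal sketch of a standard dynamic-programming sequence distance (DTW-style) that could serve as that step, assuming a frame-wise Euclidean cost between feature vectors (the function name and cost choice are illustrative, not the authors' implementation):

```python
import numpy as np

def dp_matching_distance(seq_a, seq_b):
    """DP-matching distance between two feature sequences of
    shape (T, D), allowing nonlinear temporal alignment."""
    n, m = len(seq_a), len(seq_b)
    # Frame-to-frame Euclidean cost matrix (assumed cost function)
    cost = np.linalg.norm(seq_a[:, None, :] - seq_b[None, :, :], axis=2)
    # Accumulated cost with an extra padding row/column of infinity
    acc = np.full((n + 1, m + 1), np.inf)
    acc[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            acc[i, j] = cost[i - 1, j - 1] + min(
                acc[i - 1, j],      # step in sequence A only
                acc[i, j - 1],      # step in sequence B only
                acc[i - 1, j - 1],  # step in both (match)
            )
    return acc[n, m]
```

An input sequence would then be assigned to the enrolled person whose reference sequence yields the smallest DP-matching distance (nearest-neighbor classification over the references).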