The Journal of the Institute of Image Electronics Engineers of Japan
Online ISSN : 1348-0316
Print ISSN : 0285-9831
ISSN-L : 0285-9831
Papers
Real-time Talking Head System Based on Principal Component Analysis
Takaaki KURATATE, Keisuke KINOSHITA

2005 Volume 34 Issue 4 Pages 336-343

Abstract

In this paper we describe an animation system that can map a person's facial motion to a wide selection of realistic face models in real time. The motion is obtained from a motion capture system that measures the 3D positions of infra-red LED markers placed on a subject's face. Using a 3D laser scanner, we also scan nine predefined postures specific to speech production for the same subject. Target faces are generated from 3D mesh points also measured with a laser scanner. The transformation between the motion data and the static postures is computed using linear mapping and PCA (Principal Component Analysis). With this method only a small number of parameters are required to generate facial animation: three parameters corresponding to the dominant principal components to control face motion, and six parameters to control rigid head motion. By reducing the parameter space and distributing the processing between two networked computers, motion capture processing, parameter transformation, and high-quality realistic facial animation synthesis are made possible in real time.
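The following is a minimal sketch, not the authors' implementation, of the kind of PCA-based parameter reduction the abstract describes: marker data from the nine scanned speech postures define a three-dimensional principal-component space, and a linear mapping (here an assumed least-squares fit) converts those three control parameters into target-mesh vertex displacements. All data shapes and variable names are hypothetical.

import numpy as np

# Hypothetical training data:
#   postures:     9 scanned postures x M flattened marker coordinates (x, y, z)
#   mesh_targets: 9 postures x V flattened target-mesh vertex coordinates
rng = np.random.default_rng(0)
M, V = 18 * 3, 500 * 3                  # e.g. 18 markers, 500 mesh vertices (assumed)
postures = rng.normal(size=(9, M))
mesh_targets = rng.normal(size=(9, V))

# --- Offline: learn the low-dimensional face-motion space ---------------
mean_posture = postures.mean(axis=0)
centered = postures - mean_posture
# SVD of the centered posture matrix yields the principal components;
# keep the three dominant ones, matching the three motion parameters.
_, _, components = np.linalg.svd(centered, full_matrices=False)
pcs = components[:3]                    # (3, M)

# Linear mapping from the 3 PCA parameters to mesh vertex displacements,
# fitted by least squares over the nine postures (an assumption).
train_params = centered @ pcs.T         # (9, 3)
mesh_mean = mesh_targets.mean(axis=0)
mapping, *_ = np.linalg.lstsq(train_params, mesh_targets - mesh_mean, rcond=None)

# --- Online: per frame, 3 parameters drive the target face mesh ---------
def animate(frame_markers: np.ndarray) -> np.ndarray:
    """Project one frame of marker data onto the 3 principal components,
    then synthesize the target mesh; the 6-DOF rigid head motion would be
    applied separately."""
    params = (frame_markers - mean_posture) @ pcs.T   # 3 control parameters
    return mesh_mean + params @ mapping               # reconstructed vertices

frame = postures[0] + rng.normal(scale=0.01, size=M)
vertices = animate(frame)
print(vertices.shape)                   # (1500,) -> 500 vertices x 3 coordinates

In the system described in the paper, the offline stage would run once per target face, while the per-frame projection and mapping are cheap enough to run alongside motion capture processing on networked machines in real time.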

© 2005 by the Institute of Image Electronics Engineers of Japan