Lip-Sync in Human Face Animation based on Video Analysis and Spline Models
Human facial animation is an interesting and difficult problem in computer graphics. In this paper, a novel B-spline (NURBS) muscle system is proposed to simulate 3D facial expressions and talking animation. The system extracts lip-shape parameters from video of a real person's lip movements and uses them to control the appropriate muscles to form different phonemes. The muscles are constructed from non-uniform rational B-spline curves, laid out according to anatomical knowledge. By varying the number of control points on the muscles, more detailed facial expressions and mouth shapes can be simulated. We demonstrate the flexibility of our model by simulating different emotions and by lip-syncing a talking head to a video using the automatically extracted lip parameters.
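To give a feel for the curve machinery the abstract mentions, the sketch below evaluates a point on a 2D NURBS curve via the standard Cox-de Boor recursion with rational (weighted) blending. This is a generic illustration, not the paper's implementation: the degree, knot vector, control points, and weights are made-up example values standing in for a hypothetical "muscle" curve.

```python
def bspline_basis(i, p, u, knots):
    """Cox-de Boor recursion for the i-th B-spline basis function of degree p."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = 0.0
    denom = knots[i + p] - knots[i]
    if denom > 0:
        left = (u - knots[i]) / denom * bspline_basis(i, p - 1, u, knots)
    right = 0.0
    denom = knots[i + p + 1] - knots[i + 1]
    if denom > 0:
        right = (knots[i + p + 1] - u) / denom * bspline_basis(i + 1, p - 1, u, knots)
    return left + right

def nurbs_point(u, degree, ctrl_pts, weights, knots):
    """Rational (weighted) combination of basis functions -> point on the curve."""
    num_x = num_y = den = 0.0
    for i, ((x, y), w) in enumerate(zip(ctrl_pts, weights)):
        b = bspline_basis(i, degree, u, knots) * w
        num_x += b * x
        num_y += b * y
        den += b
    return (num_x / den, num_y / den)

# Hypothetical "muscle" curve: 4 control points, cubic, clamped knot vector.
ctrl = [(0.0, 0.0), (1.0, 2.0), (3.0, 2.0), (4.0, 0.0)]
w = [1.0, 2.0, 2.0, 1.0]          # heavier inner weights pull the curve toward them
knots = [0, 0, 0, 0, 1, 1, 1, 1]  # clamped: curve interpolates its end points

print(nurbs_point(0.0, 3, ctrl, w, knots))  # -> (0.0, 0.0), the first control point
print(nurbs_point(0.5, 3, ctrl, w, knots))  # midpoint, pulled up by the weights
```

Moving a control point (or changing its weight) deforms the whole curve smoothly, which is what makes such curves convenient as muscle models: a few parameters per muscle suffice to drive a range of mouth shapes.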
Proceedings of the 10th International Multimedia Modelling Conference MMM2004
© 2004 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.