ABSTRACT
Talking virtual characters are graphical simulations of real or imaginary persons capable of human-like behavior, most importantly talking and gesturing. They may find applications on the Internet and on mobile platforms as newscasters, customer service representatives, sales representatives, guides, and more. After briefly discussing the possible applications and the technical requirements for bringing such applications to life, we describe our approach to enabling them: the Facial Animation Framework. This framework consists of (1) a lightweight, portable, MPEG-4 compatible Facial Animation Player, (2) a system for fast production of ready-to-animate, MPEG-4 compatible face models, and (3) a set of MPEG-4 compatible tools for facial animation content production. We believe this approach offers enough flexibility to adapt rapidly to a broad range of applications involving facial animation on various platforms.
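The core task of an MPEG-4 compatible Facial Animation Player — deforming a face model according to Facial Animation Parameter (FAP) values — can be sketched roughly as follows. This is a minimal illustration assuming a morph-target-style representation (one vertex displacement field per FAP, blended linearly); the `FaceModel` class and `apply_faps` method are hypothetical names for illustration, not the framework's actual API.

```python
# Hedged sketch: blending per-FAP vertex displacements onto a neutral face
# mesh, in the spirit of an MPEG-4 FBA player. Names are illustrative only.
import numpy as np

class FaceModel:
    """A face mesh with a neutral pose and one displacement field per FAP."""

    def __init__(self, neutral_vertices, fap_displacements):
        self.neutral = np.asarray(neutral_vertices, dtype=float)  # shape (V, 3)
        # fap_id -> (V, 3) vertex offsets for that FAP at unit intensity
        self.fap_displacements = {
            fap_id: np.asarray(d, dtype=float)
            for fap_id, d in fap_displacements.items()
        }

    def apply_faps(self, fap_values):
        """Return deformed vertices for a dict {fap_id: normalized FAP value}."""
        verts = self.neutral.copy()
        for fap_id, value in fap_values.items():
            disp = self.fap_displacements.get(fap_id)
            if disp is not None:
                verts += value * disp  # linear blend of this FAP's morph target
        return verts

# Usage: a toy 2-vertex mesh where a hypothetical "open jaw" FAP (id 3)
# lowers the chin vertex.
model = FaceModel(
    neutral_vertices=[[0.0, 0.0, 0.0], [0.0, -1.0, 0.0]],
    fap_displacements={3: [[0.0, 0.0, 0.0], [0.0, -0.5, 0.0]]},
)
deformed = model.apply_faps({3: 1.0})
```

In a real player the displacement fields would be authored per model (which is what makes a fast face-model production pipeline valuable), while the stream of FAP values stays model-independent.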
Index Terms
- Facial animation framework for the web and mobile platforms