Expression Cloning. Jun-yong Noh, Ulrich Neumann. SIGGRAPH 2001.
Dec 21, 2015
Introduction (1/2)
What is Expression Cloning?
▪ Retarget existing facial animations to new models
Why Expression Cloning?
▪ Easily create facial animations for character models
▪ Provides an alternative to animating each model from scratch
Introduction (2/2)
How to do Expression Cloning?
▪ Transfer vertex motion vectors from a source face model to a target model
1. Determine surface point correspondences
2. Transfer motion vectors
Related works (1/4)
Two kinds of facial animation approaches
1. Physical behaviors of the bone and muscle structures
▪ A Muscle Model for Animating Three-Dimensional Facial Expression, K. Waters [31]
Related works (2/4)
Two kinds of approaches
2. Smooth surface deformation
▪ Animations do not simply transfer between models
▪ Making Faces, B. Guenter et al. [31]
Related works (3/4)
Reusing data for new models
Vector-based muscle models
▪ Place heuristic muscles under the surface of the face
▪ Must be repeated for each new model
A parametric approach
▪ Associate the motion of a group of vertices with a specific parameter
▪ Manual association must be repeated for each model
Related works (4/4)
The goal of this paper
▪ Reuse motion data to produce facial animations of the same quality
▪ Easily transfer animation to varied target models from one generic model
Similar work
▪ Performance-driven facial animation; MPEG-4
▪ Tracking a live actor; 84 feature points
Expression Cloning (1/11)
Two steps:
1. Dense surface correspondences
2. Animation with motion vectors
Expression Cloning (2/11)
1. Dense surface correspondences
▪ Determine which surface points on the target correspond to vertices in the source model
▪ The models may differ in vertex count and connectivity
▪ A small set of initial correspondences establishes an approximate relationship
Expression Cloning (3/11)
1. Dense surface correspondences
▪ Radial Basis Functions (RBF) roughly project vertices of the source model onto the target model
▪ Cylindrical projections map each deformed source vertex onto the target surface
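The RBF morph above can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: I assume a Gaussian kernel and a fixed width `sigma` (the paper's kernel and parameters may differ). Given a few landmark correspondences, we solve for weights that map source landmarks onto target landmarks, then apply the same warp to every source vertex.

```python
# Sketch of a Gaussian-RBF warp fit to a small set of landmark
# correspondences (kernel choice and sigma are assumptions).
import math

def rbf_warp(src_landmarks, tgt_landmarks, sigma=1.0):
    """Return a function warping any 3D point with a Gaussian RBF fit."""
    n = len(src_landmarks)

    def phi(p, q):
        # Gaussian kernel on squared Euclidean distance
        d2 = sum((a - b) ** 2 for a, b in zip(p, q))
        return math.exp(-d2 / (2.0 * sigma ** 2))

    # Build the n x n kernel matrix and solve A w = t per coordinate
    # with plain Gaussian elimination (no external libraries).
    A = [[phi(src_landmarks[i], src_landmarks[j]) for j in range(n)]
         for i in range(n)]
    weights = []
    for dim in range(3):
        b = [tgt_landmarks[i][dim] for i in range(n)]
        M = [row[:] + [b[i]] for i, row in enumerate(A)]
        for col in range(n):                      # forward elimination
            piv = max(range(col, n), key=lambda r: abs(M[r][col]))
            M[col], M[piv] = M[piv], M[col]
            for r in range(col + 1, n):
                f = M[r][col] / M[col][col]
                for c in range(col, n + 1):
                    M[r][c] -= f * M[col][c]
        w = [0.0] * n                              # back substitution
        for r in range(n - 1, -1, -1):
            w[r] = (M[r][n] - sum(M[r][c] * w[c]
                                  for c in range(r + 1, n))) / M[r][r]
        weights.append(w)

    def warp(p):
        k = [phi(p, src_landmarks[j]) for j in range(n)]
        return tuple(sum(weights[d][j] * k[j] for j in range(n))
                     for d in range(3))

    return warp
```

Because the warp interpolates, each source landmark maps exactly onto its target landmark, and all other source vertices are carried along smoothly toward the target shape.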
Expression Cloning (4/11)
1. Dense surface correspondences
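A minimal sketch of the cylindrical projection used to pair points between the two surfaces; the axis and centering convention here are my assumptions. Each vertex is mapped to (angle, height) on a cylinder around the vertical axis, so nearby projected coordinates indicate candidate correspondences.

```python
# Sketch of cylindrical projection for correspondence search
# (axis convention is an assumption, not the paper's exact setup).
import math

def cylindrical_project(vertex, center=(0.0, 0.0, 0.0)):
    """Map a 3D vertex to cylindrical (theta, height) coordinates."""
    x = vertex[0] - center[0]
    y = vertex[1] - center[1]
    z = vertex[2] - center[2]
    theta = math.atan2(z, x)   # angle around the y axis, in (-pi, pi]
    return theta, y

def closest_projected(vertex, candidates, center=(0.0, 0.0, 0.0)):
    """Pick the candidate whose cylindrical projection is nearest."""
    t0, h0 = cylindrical_project(vertex, center)

    def dist(c):
        t, h = cylindrical_project(c, center)
        # wrap the angular difference into (-pi, pi]
        dt = math.atan2(math.sin(t - t0), math.cos(t - t0))
        return dt * dt + (h - h0) ** 2

    return min(candidates, key=dist)
```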
Expression Cloning (5/11)
2. Animation with Motion Vectors
▪ Displace each target vertex to match the motion of its corresponding source surface point
▪ Dense source motion vectors are obtained by linear interpolation
▪ The direction and magnitude of each motion vector must be altered and scaled for the target
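The linear interpolation of dense motion vectors can be sketched as below: a target surface point lying inside a source triangle receives a blend of the triangle's three per-vertex motion vectors, weighted by its barycentric coordinates (helper names are my own, not the paper's).

```python
# Sketch: blend three per-vertex motion vectors with barycentric weights
# to get a dense motion vector for an interior surface point.
def interpolate_motion(bary, tri_motions):
    """bary: (w0, w1, w2) barycentric weights summing to 1.
    tri_motions: motion vectors of the triangle's three vertices."""
    w0, w1, w2 = bary
    m0, m1, m2 = tri_motions
    return tuple(w0 * a + w1 * b + w2 * c for a, b, c in zip(m0, m1, m2))
```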
Expression Cloning (6/11)
2. Animation with Motion Vectors
2.1 Motion Vector Direction Adjustment
2.2 Motion Vector Magnitude Adjustment
Expression Cloning (7/11)
Direction Adjustment
Expression Cloning (8/11)
Magnitude Adjustment
Expression Cloning (9/11)
Direction Adjustment & Magnitude Adjustment
▪ m : motion vector
▪ Local bounding box (BB): scale and rotate m to fit the target's local geometry
▪ Limit the adjustment by a global threshold
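A minimal sketch of the magnitude adjustment, simplified from the slide's description (this is my own illustration, not the paper's exact formulation): each component of the motion vector is scaled by the ratio of the target's local bounding box to the source's, and the scale factor is clamped by a global threshold.

```python
# Sketch: per-axis motion-vector scaling by local bounding-box ratio,
# with a global clamp on the scale factor (simplified assumption).
def adjust_motion(m, src_bb, tgt_bb, max_scale=3.0):
    """m: source motion vector; src_bb, tgt_bb: local bounding-box
    extents per axis; max_scale: global limit on the scale factor."""
    out = []
    for comp, s, t in zip(m, src_bb, tgt_bb):
        scale = t / s if s != 0 else 1.0
        # clamp the ratio so extreme local size differences
        # cannot blow up or kill the motion
        scale = max(min(scale, max_scale), 1.0 / max_scale)
        out.append(comp * scale)
    return tuple(out)
```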
Expression Cloning (10/11)
Lip contact line
▪ Both models have lips that touch at a contact line
▪ Lower-lip vertices may otherwise be controlled by upper-lip triangles
Solution:
▪ Include all source-model lip contact line vertices in the RBF morphing step
▪ Completely align the lip contact lines of the two models
Expression Cloning (11/11)
Automated Correspondence Selection
▪ A small set of correspondences is needed for the RBF morphing
▪ 15 heuristic rules apply when working with human faces
Results
▪ Animations can be created from motion capture data
▪ The method works on a wide variety of target models
Conclusion
Expression cloning
▪ Uses high-quality dense 3D data from source model animations
▪ Produces animations of different models with similar expressions
▪ The method is fast and produces real-time animations
Future Work
▪ Stick figures and cartoons: use sparse source data without loss of expressive animation quality
▪ Control knobs to amplify or reduce a certain expression
▪ Tongue and teeth models