Online Modeling For Realtime Facial Animation
We present a new algorithm for realtime face tracking on commodity RGB-D sensing devices. Our method requires no user-specific training or calibration, nor any other form of manual assistance, thus enabling a range of new applications in performance-based facial animation and virtual interaction at the consumer level. The key novelty of our approach is an optimization algorithm that jointly solves for a detailed 3D expression model of the user and the corresponding dynamic tracking parameters. Realtime performance and robust computations are facilitated by a novel subspace parameterization of the dynamic facial expression space. A detailed evaluation shows that our approach significantly simplifies the performance capture workflow while achieving accurate facial tracking for realtime applications.
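To make the joint optimization concrete, below is a minimal sketch (not the authors' implementation) of an alternating scheme in that spirit: a tracking step solves for per-frame expression weights of a linear blendshape model F(x) = b0 + Bx against observed 3D targets, and a modeling step refines the user-specific basis B from the accumulated frames. All names here (`b0`, `B`, `fit_weights`, `refine_model`) and the simple gradient update are illustrative assumptions, standing in for the paper's subspace-constrained formulation.

```python
# Hypothetical sketch of alternating tracking/modeling; not the paper's algorithm.
import numpy as np

def fit_weights(target, b0, B, lam=1e-3):
    """Tracking step: ridge-regularized least squares for the expression
    weights x minimizing ||b0 + B x - target||^2 + lam ||x||^2."""
    r = target - b0
    A = B.T @ B + lam * np.eye(B.shape[1])
    return np.linalg.solve(A, B.T @ r)

def refine_model(frames, weights, b0, B, step=0.1):
    """Modeling step: gradient update of the blendshape basis B so the
    accumulated frames are better reproduced (a stand-in for the paper's
    subspace-parameterized model refinement)."""
    grad = np.zeros_like(B)
    for t, x in zip(frames, weights):
        grad += np.outer((b0 + B @ x) - t, x)  # d/dB of 0.5*||b0 + Bx - t||^2
    return B - step * grad / len(frames)

# Toy usage: 3 vertices (9 coordinates), 2 blendshapes, 5 observed frames.
rng = np.random.default_rng(0)
b0 = rng.normal(size=9)
B = rng.normal(size=(9, 2))
frames = [b0 + B @ rng.uniform(0, 1, 2) + 0.01 * rng.normal(size=9)
          for _ in range(5)]
for _ in range(10):  # alternate tracking and modeling until convergence
    W = [fit_weights(t, b0, B) for t in frames]
    B = refine_model(frames, W, b0, B)
```

In the paper's online setting, such alternation would run per frame under realtime constraints, which is what motivates the subspace parameterization; the batch loop above is only for illustration.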