3D Spectral Nonrigid Registration of Facial Expression Scans

In this paper, we introduce a new template-based spectral nonrigid registration method in which the target is represented using multilevel partition of unity (MPU) implicit surfaces and the template is embedded in a discrete Laplace-Beltrami based spectral representation using the manifold harmonics transform (MHT). The implicit surface parametrization of the target allows us to avoid computing correspondences during registration, as is done in classical nonrigid iterative closest point (ICP) techniques. It also allows us to denoise the 3D scans and fill holes by interpolating the noisy 3D data, and to incorporate different types of 3D surfaces into our model, independently of their original parametrization. We take advantage of spectral geometry processing methods to compute a spectral embedding of the template and use it as a parametric surface deformation model. We optimize the nonrigid deformation directly in the spectral domain, thus effectively reducing the size of the parameter space compared with the classical per-vertex affine transformation deformation model.

In addition, we introduce a new 3D facial expression database, EPFL3DFace, on which we apply the proposed method to nonrigidly register 3D face scans containing different expressions. The database consists of 3D scans of 120 subjects posing 35 different facial expressions. These include various standard prototypical facial expressions as well as individual action units, visemes, and the facial movement of biting one's own upper lip, which are suitable for a large variety of applications.
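The core idea of optimizing in the spectral domain can be illustrated with a toy sketch: project a mesh's coordinate functions onto the low-frequency eigenvectors of a discrete Laplacian, so a shape (or a deformation of it) is described by a handful of spectral coefficients instead of per-vertex parameters. The sketch below is an assumption-laden simplification, not the paper's implementation: it uses a combinatorial (umbrella-weight) Laplacian of a cycle graph in place of the cotangent-weighted Laplace-Beltrami operator on a face mesh.

```python
import numpy as np

# Toy manifold-harmonics sketch (assumption: a combinatorial Laplacian on a
# closed polyline stands in for the discrete Laplace-Beltrami operator).
n = 64
theta = np.linspace(0, 2 * np.pi, n, endpoint=False)
# A smooth closed curve in 3D plays the role of the template surface.
V = np.stack([np.cos(theta), np.sin(theta), 0.1 * np.sin(3 * theta)], axis=1)

# Combinatorial Laplacian of the cycle graph joining consecutive vertices.
L = 2 * np.eye(n)
idx = np.arange(n)
L[idx, (idx + 1) % n] = -1
L[idx, (idx - 1) % n] = -1

# Manifold harmonics basis: Laplacian eigenvectors sorted by eigenvalue.
evals, evecs = np.linalg.eigh(L)

# Forward transform: project the 3 coordinate functions onto the first k modes.
k = 8
H = evecs[:, :k]        # n x k harmonic basis
coeffs = H.T @ V        # k x 3 spectral coefficients

# Inverse transform: reconstruct the shape from only k coefficients per axis.
V_rec = H @ coeffs

err = np.linalg.norm(V - V_rec) / np.linalg.norm(V)
print(f"relative reconstruction error with {k} modes: {err:.2e}")
print(f"spectral parameters: {k * 3} vs per-vertex affine: {n * 12}")
```

Since the curve contains only low angular frequencies, the first few harmonics reconstruct it almost exactly, while the parameter count drops from 12 per vertex (a per-vertex affine model) to 3 per retained mode.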

Published in:
IEEE Transactions on Visualization and Computer Graphics
Institute of Electrical and Electronics Engineers

Record created 2017-04-29, last modified 2018-09-13
