Robust Articulated-ICP for Real-Time Hand Tracking
We present a robust method for capturing articulated hand motions in real time using a single depth camera. Our system is based on a real-time registration process that accurately reconstructs hand poses by fitting a 3D articulated hand model to depth images. We register the hand model using depth, silhouette, and temporal information. To effectively map low-quality depth maps to realistic hand poses, we regularize the registration with kinematic and temporal priors, as well as a data-driven prior built from a database of realistic hand poses. We present a principled way of integrating such priors into our registration optimization to enable robust tracking without severely restricting the freedom of motion. A core technical contribution is a new method for computing tracking correspondences that directly models occlusions typical of single-camera setups. To ensure reproducibility of our results and facilitate future research, we fully disclose the source code of our implementation.
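The abstract describes the registration as fitting the articulated hand model by combining several energy terms (depth data, silhouette, temporal smoothness, and pose priors) in one optimization. As a rough illustration of that structure only, the following Python sketch stacks square-root-weighted residual terms into a single damped least-squares problem that is warm-started from the previous frame's pose. All names, weights, the 26-DOF pose vector, and the toy residual terms are assumptions for illustration; this is not the authors' released implementation.

```python
"""Minimal sketch (assumed, illustrative) of a multi-term registration energy
minimized per frame, in the spirit of articulated ICP with priors."""
import numpy as np
from scipy.optimize import least_squares


def stacked_residuals(theta, terms, weights):
    """Stack sqrt-weighted residuals so the solver minimizes
    sum_k w_k * ||r_k(theta)||^2 over the pose vector theta."""
    return np.concatenate([np.sqrt(w) * r(theta) for r, w in zip(terms, weights)])


def track_frame(theta_prev, terms, weights):
    """One tracked frame: warm-start from the previous pose and run a few
    Levenberg-Marquardt (damped least-squares) iterations."""
    result = least_squares(
        stacked_residuals,
        theta_prev,
        args=(terms, weights),
        method="lm",
        max_nfev=50,
    )
    return result.x


if __name__ == "__main__":
    # Toy stand-ins for the real energy terms (hypothetical, for illustration):
    # a 26-DOF pose vector pulled toward synthetic "observations".
    rng = np.random.default_rng(0)
    theta0 = np.zeros(26)                  # previous-frame pose (rest pose here)
    target = rng.normal(size=26)           # proxy for depth/silhouette evidence
    terms = [
        lambda th: th - target,            # data term (point-to-model proxy)
        lambda th: th[:5] - target[:5],    # silhouette term proxy
        lambda th: th - theta0,            # temporal smoothness term
        lambda th: 0.1 * th,               # pose prior (pull toward rest pose)
    ]
    weights = [1.0, 0.5, 0.1, 0.05]
    theta = track_frame(theta0, terms, weights)
    print("estimated pose (first 5 DOFs):", np.round(theta[:5], 3))
```

In the actual system the data and silhouette residuals come from correspondences between the depth map and the posed hand model, and the priors are the kinematic, temporal, and data-driven terms named in the abstract; here they are replaced by simple quadratic stand-ins so the sketch runs on its own.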
Teaser image: teaser.png
Supplementary video: new_video.mp4