Title: Robust Articulated-ICP for Real-Time Hand Tracking
Authors: Tagliasacchi, Andrea; Schroeder, Matthias; Tkach, Anastasia; Bouaziz, Sofien; Botsch, Mario; Pauly, Mark
Deposited: 2015-12-02
Year: 2015
DOI: 10.1111/cgf.12700
URL: https://infoscience.epfl.ch/handle/20.500.14299/120940
Web of Science: WOS:000362293300010
Type: text::journal::journal article::research article

Abstract: We present a robust method for capturing articulated hand motions in real time using a single depth camera. Our system is based on a real-time registration process that accurately reconstructs hand poses by fitting a 3D articulated hand model to depth images. We register the hand model using depth, silhouette, and temporal information. To effectively map low-quality depth maps to realistic hand poses, we regularize the registration with kinematic and temporal priors, as well as a data-driven prior built from a database of realistic hand poses. We present a principled way of integrating such priors into our registration optimization, enabling robust tracking without severely restricting the freedom of motion. A core technical contribution is a new method for computing tracking correspondences that directly models the occlusions typical of single-camera setups. To ensure reproducibility of our results and facilitate future research, we fully disclose the source code of our implementation.
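The registration described in the abstract minimizes an energy that combines data terms (depth, silhouette) with regularizing priors (kinematic, temporal, data-driven). The following is a minimal illustrative sketch, not the paper's implementation: a toy 2-link planar "finger" is fit to a single observed fingertip position by gradient descent on an energy with a data term and a temporal prior. All names, link lengths, and weights here are illustrative assumptions; the actual system uses a full 3D hand model and a more sophisticated solver.

```python
import math

# Toy 2-link planar kinematic chain (stand-in for the articulated hand model).
L1, L2 = 1.0, 0.8  # link lengths (illustrative)

def fingertip(theta):
    """Forward kinematics: fingertip position for joint angles (t1, t2)."""
    t1, t2 = theta
    x = L1 * math.cos(t1) + L2 * math.cos(t1 + t2)
    y = L1 * math.sin(t1) + L2 * math.sin(t1 + t2)
    return x, y

def energy(theta, target, theta_prev, w_temporal=0.1):
    """Data term (fingertip-to-observation distance, standing in for the
    depth/silhouette terms) plus a temporal prior pulling toward the
    previous frame's pose."""
    x, y = fingertip(theta)
    data = (x - target[0]) ** 2 + (y - target[1]) ** 2
    prior = sum((a - b) ** 2 for a, b in zip(theta, theta_prev))
    return data + w_temporal * prior

def register(theta_prev, target, steps=200, lr=0.05, eps=1e-5):
    """One registration: gradient descent with numerical gradients
    (a crude stand-in for the paper's registration optimization)."""
    theta = list(theta_prev)
    for _ in range(steps):
        grad = []
        for i in range(len(theta)):
            bumped = list(theta)
            bumped[i] += eps
            grad.append(
                (energy(bumped, target, theta_prev)
                 - energy(theta, target, theta_prev)) / eps)
        theta = [t - lr * g for t, g in zip(theta, grad)]
    return theta

theta_prev = [0.2, 0.3]          # pose from the previous frame
target = fingertip([0.4, 0.5])   # synthetic "observation" for this frame
theta = register(theta_prev, target)
```

The temporal prior keeps the solution close to the previous frame, which is what allows robust frame-to-frame tracking without hard-constraining the motion; the paper's full energy adds kinematic limits and a data-driven pose prior in the same additive fashion.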