A Robust Finger Tracking Method for Multimodal Wearable Computer Interfacing

Mobile wearable computers are intended to give users real-time access to information in a natural and unobtrusive manner. Computing and sensing in such devices must be reliable, easy to interact with, transparent, and configurable to support different needs and levels of complexity. This paper presents a robust vision-based finger tracking algorithm, combined with audio-based control commands, integrated into an unobtrusive multimodal user interface. With this interface, the user can segment out objects of interest in the environment by encircling them with a pointing fingertip. To quickly extract the encircled objects from a complex scene, the interface uses a single head-mounted camera to capture color images, which are then processed in four stages: color segmentation, fingertip shape analysis, perturbation model learning, and robust fingertip tracking. The interface is made robust to changes in the environment and in the user's movements through a state-space estimation algorithm with uncertain models, which limits the influence of uncertain environmental conditions on fingertip tracking performance by adapting the tracking model to compensate for the uncertainties inherent in data collected with a wearable computer.
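The robust-tracking idea described in the abstract can be illustrated with a minimal sketch: a constant-velocity Kalman filter over the fingertip position whose measurement noise is inflated when an observation deviates strongly from the prediction. This is only a hedged stand-in for the paper's state-space estimation with uncertain models; the function name, noise constants, and the innovation-based inflation rule are all assumptions, not the authors' actual algorithm.

```python
import numpy as np

def track_fingertip(measurements, dt=1.0, base_r=4.0):
    """Illustrative fingertip tracker (not the paper's algorithm).

    State: [x, y, vx, vy] under a constant-velocity motion model;
    each measurement is an observed fingertip position (x, y).
    The measurement-noise variance is inflated for large innovations,
    so outlier detections pull the estimate less.
    """
    # Constant-velocity transition and position-only observation matrices.
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)
    Q = 0.01 * np.eye(4)  # process noise (assumed value)

    x = np.array([measurements[0][0], measurements[0][1], 0.0, 0.0])
    P = np.eye(4)
    track = [x[:2].copy()]
    for z in measurements[1:]:
        # Predict.
        x = F @ x
        P = F @ P @ F.T + Q
        # Innovation-dependent noise inflation: a crude proxy for
        # adapting the model under uncertain observations.
        innov = np.asarray(z, dtype=float) - H @ x
        R = base_r * (1.0 + innov @ innov / 100.0) * np.eye(2)
        # Update.
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ innov
        P = (np.eye(4) - K @ H) @ P
        track.append(x[:2].copy())
    return np.array(track)
```

For example, feeding the filter detections that move along a straight line yields a smoothed trajectory that converges toward the measurements while damping sudden jumps.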


Published in:
IEEE Transactions on Multimedia, vol. 8, no. 5, pp. 956-972
Year:
2006
Publisher:
IEEE
ISSN:
1520-9210
 Record created 2017-12-19, last modified 2018-12-03

