Robust finger tracking for wearable computer interfacing

Key to the design of human-machine gesture interfaces is the machine's ability to quickly and efficiently identify and track its user's hand movements. In a wearable computer system equipped with head-mounted cameras, this task is extremely difficult: camera motion is uncertain because of the user's head movement, the user may stand still and then walk unpredictably, and the user's hand or pointing finger may abruptly change direction at variable speeds. This paper presents a tracking methodology based on a robust state-space estimation algorithm, which controls the influence of uncertain environmental conditions on system performance by adapting the tracking model to compensate for the uncertainties inherent in the data. Our system tracks a user's pointing gesture from a single head-mounted camera, allowing the user to encircle an object of interest and thereby coarsely segment it. The snapshot of the object is then passed to a recognition engine for identification and retrieval of any pre-stored information about the object. Compared with a plain Kalman tracker, our robust tracker reduced the estimated position error by 15% and exhibited a faster response time.
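To make the comparison concrete, the sketch below shows a minimal constant-velocity Kalman tracker of the kind the paper uses as its baseline, tracking one image coordinate of the fingertip. The optional `c` threshold illustrates one common robustification (inflating the measurement noise for large innovations so outliers pull the estimate less); it is an assumption for illustration, not necessarily the paper's exact adaptation scheme, and all parameter values (`dt`, `q`, `r`, `c`) are arbitrary.

```python
def kalman_track(measurements, dt=1.0, q=1e-2, r=1.0, c=None):
    """Track one fingertip coordinate with a constant-velocity Kalman filter.

    State is [position, velocity]; each element of `measurements` is a noisy
    scalar position observation. If `c` is given, the measurement noise is
    inflated for innovations larger than `c` (a simple robustification,
    hypothetical with respect to the paper's actual method).
    """
    x, v = measurements[0], 0.0          # state estimate [pos, vel]
    P = [[1.0, 0.0], [0.0, 1.0]]         # state covariance
    estimates = [x]
    for z in measurements[1:]:
        # Predict: x' = F x with F = [[1, dt], [0, 1]].
        x_p = x + dt * v
        v_p = v
        # P' = F P F^T + Q, with process noise q on the velocity term.
        p00 = P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1]
        p01 = P[0][1] + dt * P[1][1]
        p10 = P[1][0] + dt * P[1][1]
        p11 = P[1][1] + q
        # Innovation for the scalar position measurement (H = [1, 0]).
        y = z - x_p
        # Optional robust step: down-weight large innovations, e.g. those
        # caused by abrupt head or hand motion, by inflating r.
        r_eff = r if (c is None or abs(y) <= c) else r * abs(y) / c
        s = p00 + r_eff                  # innovation covariance
        k0, k1 = p00 / s, p10 / s        # Kalman gain
        x = x_p + k0 * y
        v = v_p + k1 * y
        # P = (I - K H) P'.
        P = [[(1 - k0) * p00, (1 - k0) * p01],
             [p10 - k1 * p00, p11 - k1 * p01]]
        estimates.append(x)
    return estimates
```

On a track corrupted by a single gross outlier, the robust variant moves far less at the outlier frame than the plain filter, which is the qualitative behavior the paper's comparison measures.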

Published in:
Proceedings of the 2001 Workshop on Perceptive User Interfaces (PUI '01), pp. 1-5
Presented at:
2001 Workshop on Perceptive User Interfaces (PUI '01), Orlando, FL, USA, November 15-16, 2001

Record created 2017-12-19, last modified 2018-09-13
