Embedding Motion in Model-Based Stochastic Tracking

Particle filtering (PF) is now established as one of the most popular methods for visual tracking. Within this framework, two assumptions are generally made: first, that the data are temporally independent given the sequence of object states, and second, that the transition prior can be used as the proposal distribution. In this paper, we argue that the first assumption does not strictly hold and that the second can be improved. We propose to handle both modeling issues using motion. Explicit motion measurements are used to drive the sampling process towards new regions of interest in the image, while implicit motion measurements are introduced in the likelihood evaluation to model the data correlation term. The proposed model makes it possible to handle abrupt motion changes and to filter out visual distractors when tracking objects with generic shape-based models. Experimental results, compared against the CONDENSATION algorithm, demonstrate superior tracking performance.
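To make the sampling idea concrete, the following is a minimal Python sketch of one particle-filter step in which an explicit motion estimate shifts the proposal away from the plain transition prior, with the importance weights corrected accordingly. The helpers `measure_motion`, `dynamics`, and `likelihood`, as well as the Gaussian noise model, are hypothetical placeholders for illustration only; they are not the state model, motion estimator, or shape-based likelihood used in the paper.

```python
import numpy as np

def pf_step(particles, weights, frame_prev, frame_cur,
            measure_motion, dynamics, likelihood, noise_std=2.0):
    """One SIR step with a motion-driven proposal (illustrative sketch).

    particles : (N, d) array of object states (e.g. position, scale)
    weights   : (N,) importance weights summing to 1
    """
    N, d = particles.shape
    rng = np.random.default_rng()

    # Resample according to the current weights (multinomial resampling
    # for brevity).
    idx = rng.choice(N, size=N, p=weights)
    particles = particles[idx]

    # Explicit motion measurement (e.g. a coarse optical-flow estimate around
    # each particle) drives the proposal towards the new object location,
    # instead of sampling blindly from the transition prior.
    motion = measure_motion(frame_prev, frame_cur, particles)        # (N, d)
    prior_mean = dynamics(particles)                                 # transition prior mean
    prop_mean = prior_mean + motion                                  # motion-shifted proposal mean
    proposed = prop_mean + rng.normal(0.0, noise_std, (N, d))

    # Importance weights: likelihood * transition prior / proposal density.
    # With equal-covariance Gaussians, the normalization constants cancel and
    # only the exponents remain.
    log_prior = -0.5 * np.sum((proposed - prior_mean) ** 2, axis=1) / noise_std ** 2
    log_prop = -0.5 * np.sum((proposed - prop_mean) ** 2, axis=1) / noise_std ** 2
    log_w = np.log(likelihood(frame_cur, proposed) + 1e-300) + log_prior - log_prop

    w = np.exp(log_w - log_w.max())
    return proposed, w / w.sum()
```

In this sketch the CONDENSATION-style baseline corresponds to setting `motion` to zero, in which case the prior and proposal terms cancel and the weights reduce to the likelihood alone.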


Published in:
IEEE Transactions on Image Processing, vol. 15, no. 11, pp. 3514-3530
Year:
2006
Note:
IDIAP-RR 04-61