Embedding Motion in Model-Based Stochastic Tracking
Particle filtering is now established as one of the most popular methods for visual tracking. Within this framework, two assumptions are generally made. The first is that the data are temporally independent given the sequence of object states. In this paper, we argue that in general the data are correlated, and that modeling this dependency should improve tracking robustness. The second assumption is the use of the transition prior as the proposal distribution. The current observation is thus not taken into account, requiring the noise process of this prior to be large enough to handle abrupt trajectory changes. As a result, many particles are either wasted in low-likelihood areas, reducing sampling efficiency, or, more importantly, propagated near distractor regions of the image, leading to tracking failures. In this paper, we propose to handle both issues using motion. Explicit motion measurements are used to drive the sampling process towards the new interesting regions of the image, while implicit motion measurements are introduced in the likelihood evaluation to model the data correlation term. The proposed model makes it possible to handle abrupt motion changes and to filter out visual distractors when tracking objects with generic models based on shape or color-distribution representations. Experimental results compared against the CONDENSATION algorithm demonstrate superior tracking performance.
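The proposal-side idea in the abstract can be illustrated with a toy sketch: a 1-D particle filter whose proposal is centred on an explicit motion measurement instead of a zero-mean transition prior. All names, noise levels, and the simulated trajectory below are assumptions for illustration only, not the paper's actual model (which also modifies the likelihood to capture data correlation, omitted here), and the importance weights are simplified by ignoring the proposal/prior ratio.

```python
import numpy as np

rng = np.random.default_rng(0)

def track(num_steps=50, n_particles=200, use_motion_proposal=True):
    """Toy 1-D particle filter comparing a motion-driven proposal with the
    plain transition prior. The target makes an abrupt jump halfway through."""
    # Ground-truth trajectory: slow drift with an abrupt jump at step 25.
    truth = np.cumsum(np.where(np.arange(num_steps) == 25, 8.0, 0.3))
    particles = np.full(n_particles, truth[0])
    estimates = []
    prev_truth = truth[0]
    for t in range(num_steps):
        # Hypothetical "explicit motion measurement": noisy observed displacement.
        motion = (truth[t] - prev_truth) + rng.normal(0.0, 0.2)
        prev_truth = truth[t]
        if use_motion_proposal:
            # Proposal centred on the measured motion, with small noise.
            particles = particles + motion + rng.normal(0.0, 0.5, n_particles)
        else:
            # Transition prior only: zero-mean random walk whose noise must be
            # large to have any chance of covering abrupt jumps.
            particles = particles + rng.normal(0.0, 2.0, n_particles)
        # Gaussian likelihood of a noisy position observation
        # (simplified weights: proposal/prior correction ignored).
        obs = truth[t] + rng.normal(0.0, 0.5)
        weights = np.exp(-0.5 * ((particles - obs) / 0.5) ** 2)
        weights /= weights.sum()
        estimates.append(float(np.sum(weights * particles)))
        # Multinomial resampling.
        idx = rng.choice(n_particles, n_particles, p=weights)
        particles = particles[idx]
    return float(np.mean(np.abs(np.array(estimates) - truth)))

err_motion = track(use_motion_proposal=True)
err_prior = track(use_motion_proposal=False)
```

In this sketch the motion-driven proposal concentrates particles where the target actually moved, so far fewer samples fall in low-likelihood areas; the prior-only variant needs its large noise just to survive the jump at step 25.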
Published in International Conference on Pattern Recognition (ICPR), 2004