Combination of video-based camera trackers using a dynamically adapted particle filter

This paper presents a video-based camera tracker that combines marker-based and feature-point-based cues in a particle filter framework. The framework exploits their complementary performance: marker-based trackers recover camera position and orientation robustly while a reference (marker) is visible, but fail once the reference becomes unavailable; feature-point tracking can still provide estimates from a limited number of feature points, but these tend to drift and usually fail to recover when the reference reappears. We therefore propose a combination in which the filter estimate is updated from the individual measurements of each cue: the marker-based cue is selected while the marker is available, and the feature-point-based cue is selected otherwise. The feature points tracked are the corners of the marker. Evaluations on real sequences show that the fusion of these two approaches outperforms either tracker alone. Filtering techniques often suffer from the difficulty of modeling the motion with precision. A second, related contribution is an adaptation method for the particle filter that achieves tolerance to fast motion manoeuvres.
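The cue-switching idea in the abstract can be illustrated with a minimal particle filter sketch. The code below is not the paper's implementation: it reduces the camera pose to a single scalar state, uses a random-walk motion model, and invents the noise levels (0.05 for the marker cue, 0.5 for the feature-point cue) purely for illustration. The only element taken from the abstract is the selection rule: weight particles with the marker measurement when the marker is visible, and with the feature-point measurement otherwise.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, weights, measurement, meas_noise, motion_noise):
    """One predict/update/resample cycle with a Gaussian likelihood."""
    # Predict: propagate particles with a random-walk motion model.
    particles = particles + rng.normal(0.0, motion_noise, size=particles.shape)
    # Update: weight each particle by the likelihood of the selected cue's
    # measurement (a tiny floor avoids degenerate all-zero weights).
    lik = np.maximum(np.exp(-0.5 * ((particles - measurement) / meas_noise) ** 2),
                     1e-300)
    weights = weights * lik
    weights = weights / weights.sum()
    # Systematic resampling keeps the particle set diverse.
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n
    idx = np.searchsorted(np.cumsum(weights), positions)
    return particles[idx], np.full(n, 1.0 / n)

def track(true_poses, marker_visible, n=500):
    """Track a 1-D 'pose' sequence, switching cues as in the abstract."""
    particles = rng.normal(true_poses[0], 1.0, n)
    weights = np.full(n, 1.0 / n)
    estimates = []
    for pose, visible in zip(true_poses, marker_visible):
        if visible:
            # Marker cue: accurate measurement (hypothetical noise level).
            meas, noise = pose + rng.normal(0.0, 0.05), 0.05
        else:
            # Feature-point cue: noisier, drift-prone (hypothetical level).
            meas, noise = pose + rng.normal(0.0, 0.5), 0.5
        particles, weights = particle_filter_step(
            particles, weights, meas, noise, motion_noise=0.2)
        estimates.append(float(np.average(particles, weights=weights)))
    return estimates
```

In this toy setting the estimate stays usable through the marker-less stretch and re-tightens once the marker cue returns, which is the qualitative behaviour the abstract attributes to the combined tracker.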

Published in:
Proc. 2nd International Conference on Computer Vision Theory and Applications (VISAPP07), 363--370
Presented at:
2nd International Conference on Computer Vision Theory and Applications (VISAPP07), Barcelona, Spain, 8 - 11 March

Record created 2007-01-12, last modified 2019-12-05
