Multi-modal tracking in complex environments for augmented reality applications

This work presents a model-based, real-time method for tracking 3D objects with a monocular camera. Our goal is to implement a tracking algorithm robust enough to be used in hostile environments. We achieve this goal by integrating many sources of information. Our method combines the absolute information given by a set of reference images with the relative information provided by the previous frames. This prevents both the error accumulation that makes the tracker drift and the low precision that makes it jitter. The algorithm also combines the information provided by edges with that provided by feature points, which lets the tracker handle both textured and untextured objects. To exploit these techniques, we had to solve some fundamental issues. Even though consecutive frames in a sequence are very similar to each other, they are very different from the offline reference images. We solved this problem by merging our original wide-baseline matching against the keyframes with the more conventional narrow-baseline matching against the previous frames. Tracking edges in the presence of cluttered or textured backgrounds is far from trivial because of the many spurious edges that bedevil typical edge detectors. We overcome this difficulty by handling, in real time, multiple hypotheses for potential edge locations. The result is a real-time 3D tracking algorithm that exploits both texture and edge information without being sensitive to misleading background information or lighting changes, and that does not drift or jitter over time.
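The multiple-hypothesis edge handling mentioned in the abstract can be illustrated with a minimal sketch: sample the image intensity along the normal of a projected model edge and keep every significant gradient maximum as a candidate edge location, rather than committing to the single strongest response, which a cluttered or textured background can easily corrupt. The function name, sampling scheme, and threshold below are illustrative assumptions, not the implementation described in the thesis.

```python
def edge_hypotheses(intensity, threshold=5.0):
    """Return indices of ALL candidate edge locations along a 1D
    intensity profile sampled along the normal of a projected model
    edge point. Every local gradient maximum above `threshold` is kept
    as a hypothesis (illustrative sketch; threshold is an assumption)."""
    # absolute intensity differences between neighbouring samples
    grad = [abs(b - a) for a, b in zip(intensity, intensity[1:])]
    candidates = []
    for i in range(1, len(grad) - 1):
        # a local maximum of the gradient magnitude above the threshold
        # yields one edge-location hypothesis
        if grad[i] >= grad[i - 1] and grad[i] > grad[i + 1] and grad[i] > threshold:
            candidates.append(i)
    return candidates
```

On a clutter-free profile this returns a single hypothesis; in front of a textured background it returns several, and the pose estimator can later retain the one consistent with the 3D model, which is the point of deferring the choice.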


Advisor(s):
Fua, Pascal
Year:
2004
Publisher:
Lausanne, EPFL
Other identifiers:
urn: urn:nbn:ch:bel-epfl-thesis3114-6
Laboratories:


Note: Access to this file is restricted to EPFL only.


 Record created 2005-03-16, last modified 2018-12-05

