Abstract

For augmented reality applications, accurate estimation of the camera pose is required. An existing video-based marker tracking system developed at the ITS shows limitations when the marker is completely or partially occluded. The objective of this project is to improve camera tracking in those cases. For this purpose, the tracking of the marker is extended to natural feature points in the marker's environment. The extension to natural features works as follows. Feature point candidates are searched for in the image. Depth is estimated recursively by triangulating points between the first frame and subsequent frames. A particle filter is used to take into account the new measurements received with every frame. Particle weights are sensitive to the state of the camera tracker and to the proximity of the triangulated point to the expected depth. Eventually the depth distribution of the particle filter for a given feature converges to a single point. This position is added to a map of feature points used to update the camera pose. Results show that tracking was improved: the camera pose was tracked even when the marker was not visible. However, a small drift in the pose is observed due to imprecision in the feature point positions. The drift is not excessive, in the sense that the tracker is able to recover the correct camera pose. Future work will focus on improving the accuracy of the depth estimates.
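To make the per-feature depth estimation step more concrete, the sketch below shows one plausible form of such a depth particle filter: hypotheses are re-weighted by their proximity to the depth triangulated against the current frame, with the camera tracker's confidence modulating how much each measurement counts. The class name, the confidence-based noise scaling, and the resampling threshold are assumptions made for illustration only and are not taken from the project itself.

    import numpy as np

    class DepthParticleFilter:
        """Maintains a set of depth hypotheses for one natural feature point."""

        def __init__(self, n_particles=100, d_min=0.1, d_max=10.0):
            # Depth hypotheses start uniformly spread over a plausible range.
            self.depths = np.random.uniform(d_min, d_max, n_particles)
            self.weights = np.full(n_particles, 1.0 / n_particles)

        def update(self, triangulated_depth, tracker_confidence, base_sigma=0.2):
            # Re-weight each hypothesis by its proximity to the depth triangulated
            # between the first frame and the current frame. A low camera-tracker
            # confidence inflates the measurement noise, so unreliable frames
            # influence the distribution less.
            sigma = base_sigma / max(tracker_confidence, 1e-3)
            likelihood = np.exp(-0.5 * ((self.depths - triangulated_depth) / sigma) ** 2)
            self.weights *= likelihood + 1e-12   # keep weights from collapsing to zero
            self.weights /= self.weights.sum()
            self._resample_if_degenerate()

        def _resample_if_degenerate(self, threshold_ratio=0.5):
            # Resample when the effective number of particles drops too low.
            n_eff = 1.0 / np.sum(self.weights ** 2)
            n = len(self.depths)
            if n_eff < threshold_ratio * n:
                idx = np.random.choice(n, n, p=self.weights)
                self.depths = self.depths[idx] + np.random.normal(0.0, 0.01, n)
                self.weights = np.full(n, 1.0 / n)

        def converged(self, tol=0.05):
            # The distribution has converged once the hypotheses agree closely;
            # the estimate can then be added to the feature map.
            return float(np.std(self.depths)) < tol

        def estimate(self):
            return float(np.average(self.depths, weights=self.weights))

In this sketch, once converged() reports True for a feature, estimate() would give the depth inserted into the map of feature points used to update the camera pose.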
