Robot Navigation by Panoramic Vision and Attention Guided Features

In vision-based robot navigation, panoramic vision emerges as a very attractive candidate for solving the localization task. Unfortunately, current systems rely on task-specific feature selection processes that do not meet the requirements of general-purpose robots. To fulfill the new requirements of robot versatility and robustness to environmental changes, we propose in this paper to perform the feature selection of a panoramic vision system by means of the saliency-based model of visual attention, a model known for its universality. The first part of the paper describes a localization system combining panoramic vision and visual attention. The second part presents a series of indoor localization experiments using panoramic vision and attention-guided feature detection. The results show the feasibility of the approach and illustrate some of its capabilities.
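As a rough illustration of the saliency-based attention idea mentioned above, the sketch below computes a simple center-surround saliency map on an intensity image and greedily picks the most salient locations as landmark candidates. This is a minimal stand-in, not the authors' system: the box-blur scales, the winner-take-all suppression radius, and the intensity-only channel are all simplifying assumptions (the full saliency model combines several feature channels across scales).

```python
import numpy as np

def box_blur(img, radius):
    """Crude Gaussian stand-in: average over a (2r+1) x (2r+1) window
    using shifted copies of an edge-padded image."""
    p = np.pad(img, radius, mode="edge")
    h, w = img.shape
    out = np.zeros((h, w), dtype=float)
    for dy in range(2 * radius + 1):
        for dx in range(2 * radius + 1):
            out += p[dy:dy + h, dx:dx + w]
    return out / (2 * radius + 1) ** 2

def saliency_map(intensity, center=1, surround=6):
    """Center-surround contrast: |fine-scale blur - coarse-scale blur|,
    normalized to [0, 1]. Scales are illustrative choices."""
    c = box_blur(intensity, center)
    s = box_blur(intensity, surround)
    sal = np.abs(c - s)
    return sal / (sal.max() + 1e-9)

def top_salient_points(sal, k=5, suppress=5):
    """Greedy peak picking with local suppression, in the spirit of
    winner-take-all selection: take the maximum, zero out its
    neighborhood, repeat k times."""
    sal = sal.copy()
    pts = []
    for _ in range(k):
        y, x = np.unravel_index(np.argmax(sal), sal.shape)
        pts.append((int(y), int(x)))
        sal[max(0, y - suppress):y + suppress + 1,
            max(0, x - suppress):x + suppress + 1] = 0.0
    return pts
```

For example, on a dark image with a single bright pixel, the top salient point returned by `top_salient_points(saliency_map(img), k=1)` lands at (or immediately next to) that pixel.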

Published in:
Proceedings of the International Conference on Pattern Recognition (ICPR), pp. 695-698
Presented at:
International Conference on Pattern Recognition (ICPR), August 20-24, 2006


