Abstract

Recent studies on multi-robot localization have shown that the uncertainty of robot location can be considerably reduced by optimally fusing odometry with the relative angles of sight (bearings) among the team members. However, the latter requires each robot to be able to detect the other members at large distances and over a wide field of view. Furthermore, robustness and precision in estimating the relative angle of sight are of great importance. In this paper we show how all of these requirements can be met by employing an omnidirectional sensor made up of a conic mirror and a simple webcam. We use differently colored lights to distinguish the robots and optical defocusing to identify the lights. We show that defocusing increases the detection range to several meters, compensating for the loss of resolution inherent in the omnidirectional view, without sacrificing robustness or precision. To allow a real-time implementation of light tracking, we use a recent "tree-based union-find" technique for color segmentation and region merging. We also present a self-calibration technique based on an Extended Kalman Filter to estimate the intrinsic parameters of the robot-sensor system. The performance of the approach is demonstrated through experimental results.
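
To illustrate the region-merging step mentioned above, the following is a minimal sketch, not the authors' implementation, of a tree-based union-find (disjoint-set forest with path compression and union by rank) used to group same-colored pixels into connected regions. The flat pixel labeling, the 4-connectivity rule, and the function names `UnionFind` and `merge_regions` are illustrative assumptions, not details taken from the paper.

```python
# Sketch: tree-based union-find applied to merging same-colored pixel
# regions, as commonly done in connected-component labeling.
# Assumptions: pixels carry integer color classes (e.g. from HSV
# thresholding), -1 marks background, and 4-connectivity is used.

class UnionFind:
    """Disjoint-set forest with path compression and union by rank."""

    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x):
        # Walk to the root, then compress the path for future queries.
        root = x
        while self.parent[root] != root:
            root = self.parent[root]
        while self.parent[x] != root:
            self.parent[x], x = root, self.parent[x]
        return root

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return
        if self.rank[ra] < self.rank[rb]:
            ra, rb = rb, ra
        self.parent[rb] = ra
        if self.rank[ra] == self.rank[rb]:
            self.rank[ra] += 1


def merge_regions(color_labels, width, height):
    """Group 4-connected pixels sharing the same color label.

    `color_labels` is a flat list of per-pixel color classes;
    returns a dict mapping each region's root index to its pixels.
    """
    uf = UnionFind(width * height)
    for y in range(height):
        for x in range(width):
            i = y * width + x
            if color_labels[i] < 0:
                continue
            # Merge with left and upper neighbors of the same color.
            if x > 0 and color_labels[i - 1] == color_labels[i]:
                uf.union(i, i - 1)
            if y > 0 and color_labels[i - width] == color_labels[i]:
                uf.union(i, i - width)
    regions = {}
    for i, c in enumerate(color_labels):
        if c >= 0:
            regions.setdefault(uf.find(i), []).append(i)
    return regions
```

The resulting regions can then be filtered by size and color class to isolate the defocused light blobs whose centroids give the bearing measurements; that filtering and the subsequent EKF fusion are beyond this sketch.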
