Abstract

In this paper, an approach to localization using geometric features from a 360-degree laser range finder and a monocular vision system is presented. Its practicability for continuous localization during motion in real time (referred to as on-the-fly localization) is investigated in large-scale experiments. The features are infinite horizontal lines for the laser and vertical lines for the camera. They are extracted using physically well-grounded models for both sensors and passed to a Kalman filter for fusion and position estimation. Positioning accuracy approaching the subcentimeter level has been achieved with an environment model requiring only 30 bytes per square meter. Even with a moderate number of matched features, the vision information was found to further improve this precision, particularly in orientation. The results were obtained with a fully self-contained system in extensive tests covering an overall path length of more than 6.4 km and 150,000 localization cycles. The final testbed for this localization system was the Computer 2000 event, an annual computer tradeshow in Lausanne, Switzerland, where, over four days, visitors could issue high-level navigation commands to the robot via a web interface. This gave us the opportunity to obtain results on long-term reliability and to verify the practicability of the approach under application-like conditions. Furthermore, general aspects and limitations of multisensor on-the-fly localization are discussed.

Keywords: Mobile robot localization; On-the-fly localization; Position tracking; Multisensor data fusion; Kalman filtering
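The abstract does not give the paper's filter equations, but the described fusion step can be illustrated with a minimal sketch. The example below assumes the common (alpha, r) parameterization of an infinite line (orientation and signed perpendicular distance), a 3-DOF pose state [x, y, theta], and a standard extended Kalman filter measurement update; the function name ekf_line_update and all numeric values are illustrative, not taken from the paper.

```python
import numpy as np

def ekf_line_update(x, P, z, R, landmark):
    """One EKF measurement update with a matched infinite-line feature.

    x: pose estimate [x, y, theta]; P: 3x3 pose covariance.
    z: observed line (alpha, r) in the robot frame; R: 2x2 measurement covariance.
    landmark: matched map line (alpha_w, r_w) in the world frame.
    """
    alpha_w, r_w = landmark
    px, py, th = x

    # Predicted observation: the world-frame map line expressed in the robot frame.
    z_hat = np.array([alpha_w - th,
                      r_w - (px * np.cos(alpha_w) + py * np.sin(alpha_w))])

    # Jacobian of the measurement model with respect to the pose.
    H = np.array([[0.0, 0.0, -1.0],
                  [-np.cos(alpha_w), -np.sin(alpha_w), 0.0]])

    # Innovation; wrap the angle component into [-pi, pi].
    v = z - z_hat
    v[0] = (v[0] + np.pi) % (2 * np.pi) - np.pi

    S = H @ P @ H.T + R              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x + K @ v
    P_new = (np.eye(3) - K @ H) @ P
    return x_new, P_new

if __name__ == "__main__":
    # Illustrative values only: a pose near (1, 2) observing a map line
    # at alpha = 90 deg, r = 5 m from a perpendicular distance of 3 m.
    x = np.array([1.0, 2.0, 0.1])
    P = np.diag([0.02, 0.02, 0.01])
    R = np.diag([1e-4, 1e-4])
    landmark = (np.pi / 2, 5.0)
    z = np.array([np.pi / 2 - 0.1, 3.0])
    x_new, P_new = ekf_line_update(x, P, z, R, landmark)
    print(x_new)
```

In an on-the-fly setting, an odometry-based prediction step would precede each such update, and one update of this form would be applied per matched feature, whether a horizontal line from the laser or a vertical line from the camera.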
