Aerial SLAM with a Single Camera using Visual Expectation

Micro aerial vehicles (MAVs) are a rapidly growing area of research and development in robotics. For autonomous robot operations, localization has typically been calculated using GPS, external camera arrays, or onboard range or vision sensing. In cluttered indoor or outdoor environments, onboard sensing is the only viable option. In this paper we present an appearance-based approach to visual SLAM on a flying MAV using only low-quality vision. Our approach consists of a visual place recognition algorithm that operates on 1000-pixel images, a lightweight visual odometry algorithm, and a visual expectation algorithm that improves both the recall of place sequences and the precision with which they are recalled as the robot flies along a similar path. Using data gathered from outdoor datasets, we show that the system is able to perform visual recognition with low-quality, intermittent visual sensory data. By combining the visual algorithms with the RatSLAM system, we also demonstrate how the algorithms enable successful SLAM.
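The core of appearance-based place recognition on such small images is a whole-image similarity test: the current downsampled view is compared against a library of stored view templates, and either matched to the closest one or added as a new template. The sketch below illustrates this idea with a mean-absolute-difference comparison; the function name, threshold value, and image size are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def match_template(current, templates, threshold=0.05):
    """Compare a low-resolution intensity image (values in [0, 1])
    against stored view templates using mean absolute difference.

    Returns the index of the best-matching template, or None if no
    template is similar enough (in which case the caller would store
    the current view as a new template).

    Hypothetical sketch: the threshold and comparison metric are
    illustrative, not the paper's exact parameters.
    """
    if not templates:
        return None
    diffs = [np.mean(np.abs(current - t)) for t in templates]
    best = int(np.argmin(diffs))
    return best if diffs[best] < threshold else None
```

A visual expectation mechanism would sit on top of this loop, e.g. by lowering the match threshold for templates that follow recently recalled ones in sequence, so that expected places are recalled more readily.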

Published in:
Proceedings of the 2011 IEEE International Conference on Robotics and Automation (ICRA), 1643 - 1649
Presented at:
2011 IEEE International Conference on Robotics and Automation (ICRA), Shanghai, May 9-13, 2011

