Vision-based Navigation from Wheels to Wings
We describe an incremental approach to the development of autonomous indoor flyers that use only vision to navigate in textured environments. To cope with the severe weight and energy constraints of such systems, we use spiking neural controllers that can be implemented on tiny microcontrollers and that map visual information into motor commands. The network morphology is evolved by means of an evolutionary process running on the physical robots. This methodology is tested on three robots of increasing complexity: a wheeled robot, a dirigible, and a winged robot. The paper describes the approach, the robots, their degrees of complexity, and summarizes results. In addition, three compatible electronic boards and a choice of vision sensors suitable for these robots are described in more detail. These boards allow a comparative and gradual development of spiking neural controllers for flying robots.
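As a rough illustration of the kind of controller described above, the following is a minimal sketch of a leaky integrate-and-fire spiking network that maps thresholded vision input to binary motor commands, with connection weights supplied by a flat genome (standing in for the evolved network morphology). All class names, parameters, and structural choices here are illustrative assumptions, not the actual network used in the paper.

```python
class SpikingController:
    """Minimal leaky integrate-and-fire network: vision pixels in, motor spikes out.

    NOTE: this is an illustrative sketch; the weight layout, neuron model
    parameters, and motor read-out are assumptions, not the paper's design.
    """

    def __init__(self, n_inputs, n_neurons, genome, threshold=1.0, decay=0.8):
        # genome: flat weight list, one row per neuron over (inputs + neurons)
        assert len(genome) == n_neurons * (n_inputs + n_neurons)
        row = n_inputs + n_neurons
        self.w = [genome[i * row:(i + 1) * row] for i in range(n_neurons)]
        self.threshold = threshold
        self.decay = decay                 # membrane leak per time step
        self.v = [0.0] * n_neurons         # membrane potentials
        self.spikes = [0] * n_neurons      # spike vector from previous step

    def step(self, pixels):
        """Advance one time step; pixels is a binary list of length n_inputs."""
        pre = list(pixels) + self.spikes   # inputs plus recurrent spikes
        new_v, new_s = [], []
        for weights, v in zip(self.w, self.v):
            v = self.decay * v + sum(w * x for w, x in zip(weights, pre))
            if v >= self.threshold:
                new_s.append(1)            # fire...
                new_v.append(0.0)          # ...and reset the membrane
            else:
                new_s.append(0)
                new_v.append(v)
        self.v, self.spikes = new_v, new_s
        # last two neurons drive the actuators (e.g., left/right motor)
        return self.spikes[-2], self.spikes[-1]


# Example: two inputs, two neurons, hand-written genome for illustration
ctrl = SpikingController(2, 2, [1.5, 0, 0, 0, 0, 0.5, 0, 0])
left, right = ctrl.step([1, 1])  # one sensory-motor cycle
```

In an evolutionary setup, the genome would be mutated and selected according to a fitness measure obtained while the controller runs on the physical robot.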