Title: A method for ego-motion estimation in micro-hovering platforms flying in very cluttered environments
Authors: Briod, Adrien; Zufferey, Jean-Christophe; Floreano, Dario
Date: 2016-02-01
DOI: 10.1007/s10514-015-9494-4
Handle: https://infoscience.epfl.ch/handle/20.500.14299/122943
WOS: WOS:000374253200002
Type: text::journal::journal article::research article

Abstract: We aim to develop autonomous miniature hovering flying robots capable of navigating in unstructured, GPS-denied environments. A major challenge is the miniaturization of the embedded sensors and processors that allow such platforms to fly by themselves. In this paper, we propose a novel ego-motion estimation algorithm for hovering robots equipped with inertial and optic-flow sensors that runs in real time on a microcontroller and enables autonomous flight. Unlike many vision-based methods, this algorithm does not rely on feature tracking, structure estimation, additional distance sensors, or assumptions about the environment. In this method, we introduce the translational optic-flow direction constraint, which uses the optic-flow direction but not its scale to correct for inertial sensor drift during changes of direction. This solution requires comparatively simpler electronics and sensors and works in environments of any geometry. Here we describe the implementation and performance of the method on a hovering robot equipped with eight 0.65 g optic-flow sensors, and show that it can be used for closed-loop control of various motions.

Keywords: Flying Robots; Egomotion Estimation; Optic-flow; Flight Stabilisation; Aerial Robotics
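The abstract's key idea is that translational optic flow observed from a moving camera points opposite the direction of travel, and its direction (unlike its depth-dependent magnitude) can correct drifting inertial velocity estimates without any distance sensing. The sketch below is a minimal, hypothetical illustration of that principle, not the authors' actual filter: the function name, the blending gain, and the simple direction-nudge update are all assumptions made for illustration.

```python
import numpy as np

def direction_constraint_update(v_est, flow_dir_meas, gain=0.2):
    """Hypothetical sketch of a direction-only optic-flow correction.

    v_est: velocity estimate (3-vector) obtained by integrating inertial
           measurements, which drifts over time.
    flow_dir_meas: measured translational optic-flow direction (3-vector);
           only its direction is used, its scale is deliberately ignored
           because it depends on unknown distances to the environment.
    gain:  blending factor (assumed value) controlling how strongly the
           measurement pulls the estimated direction.
    """
    d = flow_dir_meas / np.linalg.norm(flow_dir_meas)
    speed = np.linalg.norm(v_est)
    if speed < 1e-9:
        return v_est  # no motion direction to correct
    v_dir = v_est / speed
    # Translational optic flow points opposite the motion, so the constraint
    # is v_dir ~ -d. Nudge the estimated direction toward -d while keeping
    # the inertially estimated speed (scale is unobservable from direction).
    corrected_dir = (1.0 - gain) * v_dir + gain * (-d)
    corrected_dir /= np.linalg.norm(corrected_dir)
    return speed * corrected_dir
```

Note that the correction leaves the speed untouched: as the abstract states, only the flow direction constrains the estimate, which is why drift correction is most effective during changes of direction.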