Event-based, Direct Camera Tracking from a Photometric 3D Map using Nonlinear Optimization

Event cameras are novel bio-inspired vision sensors that output pixel-level intensity changes, called “events”, instead of traditional video images. These asynchronous sensors naturally respond to motion in the scene with very low latency (on the order of microseconds) and have a very high dynamic range. These features, along with very low power consumption, make event cameras an ideal sensor for fast robot localization and wearable applications, such as AR/VR and gaming. Considering these applications, we present a method to track the 6-DOF pose of an event camera in a known environment, which we assume to be described by a photometric 3D map (i.e., intensity plus depth information) built via classic dense 3D reconstruction algorithms. Our approach uses the raw events directly, without intermediate features, within a maximum-likelihood framework to estimate the camera motion that best explains the events via a generative model. We successfully evaluate the method on both simulated and real data, and show improved results over the state of the art. We release the datasets to the public to foster reproducibility and research on this topic.
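The core idea of the abstract — explaining raw events with a generative model of brightness change and estimating the motion by maximum likelihood — can be illustrated with a toy sketch. This is not the paper's implementation: it uses a hypothetical 1D intensity map, a linearized event model dL ≈ −∇I · v · dt, and estimates a single velocity parameter by least squares (the maximum-likelihood solution under Gaussian noise); the full method optimizes a 6-DOF pose nonlinearly.

```python
import numpy as np

# Hypothetical toy version of the generative-model idea:
# each event's brightness change dL is predicted from the known
# photometric map as dL ≈ -grad(I)(x) * v * dt, and the motion v
# is the maximum-likelihood (least-squares) fit to all events.

rng = np.random.default_rng(0)

# Known photometric "map": a smooth 1D intensity profile and its gradient.
x = np.linspace(0.0, 2.0 * np.pi, 1000)
I = np.sin(3.0 * x)
grad_I = np.gradient(I, x)

# Simulate events at random pixels under a true camera velocity v_true,
# with small Gaussian measurement noise (all values are illustrative).
v_true = 0.7
dt = 0.01
idx = rng.integers(0, x.size, size=500)
dL = -grad_I[idx] * v_true * dt + rng.normal(0.0, 1e-4, size=idx.size)

# Maximum-likelihood estimate: minimize sum_i (dL_i + grad_I_i * v * dt)^2.
# The model is linear in v here, so one normal-equation step suffices;
# the paper's 6-DOF case requires iterative nonlinear optimization.
A = -grad_I[idx] * dt
v_est = (A @ dL) / (A @ A)
print(f"true v = {v_true:.3f}, estimated v = {v_est:.3f}")
```

In the actual method the unknown is a full camera pose and the residuals are evaluated against a 3D intensity-plus-depth map, but the structure — predict events from a generative model, then optimize the motion that best explains them — is the same.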

Presented at:
IEEE International Conference on Robotics and Automation (ICRA), Montreal, Canada, May 20-24, 2019
url: http://rpg.ifi.uzh.ch/direct_event_camera_tracking/
 Record created 2019-06-04, last modified 2019-08-12
