000199738 001__ 199738
000199738 005__ 20190812205801.0
000199738 037__ $$aCONF
000199738 245__ $$aLow-Latency Event-Based Visual Odometry
000199738 269__ $$a2014
000199738 260__ $$c2014
000199738 336__ $$aConference Papers
000199738 520__ $$aThe agility of a robotic system is ultimately limited by the speed of its processing pipeline. The use of a Dynamic Vision Sensor (DVS), a sensor producing asynchronous events as luminance changes are perceived by its pixels, makes it possible to have a sensing pipeline with a theoretical latency of a few microseconds. However, several challenges must be overcome: a DVS does not provide grayscale values but only changes in luminance; and because the output is composed of a sequence of events, traditional frame-based visual odometry methods are not applicable. This paper presents the first visual odometry system based on a DVS plus a normal CMOS camera that provides the absolute brightness values. The two sources of data are automatically spatiotemporally calibrated from logs taken during normal operation. We design a visual odometry method that uses the DVS events to estimate the relative displacement since the previous CMOS frame by processing each event individually. Experiments show that rotation can be estimated with surprising accuracy, while translation can be estimated only very noisily, because it produces few events due to the very small apparent motion.
000199738 700__ $$aCensi, Andrea
000199738 700__ $$aScaramuzza, Davide
000199738 7112_ $$dMay 31 - June 7, 2014$$cHong Kong, China$$a2014 IEEE International Conference on Robotics and Automation (ICRA 2014)
000199738 8564_ $$zn/a$$yn/a$$uhttps://infoscience.epfl.ch/record/199738/files/ICRA14_Censi.pdf$$s3912998
000199738 909C0 $$xU12367$$pNCCR-ROBOTICS$$0252409
000199738 909CO $$qGLOBAL_SET$$pconf$$ooai:infoscience.tind.io:199738
000199738 917Z8 $$x221818
000199738 937__ $$aEPFL-CONF-199738
000199738 973__ $$rREVIEWED$$sPUBLISHED$$aOTHER
000199738 980__ $$aCONF