Beyond Point Clouds: Fisher Information Field for Active Visual Localization

For mobile robots to localize robustly, it is essential to actively account for the perception requirements at the planning stage. In this paper, we propose a novel representation for active visual localization. By carefully formulating the Fisher information and sensor visibility, we are able to summarize the localization information into a discrete grid, namely the Fisher information field. The information for arbitrary poses can then be computed from the field in constant time, without the need for costly iteration over all the 3D landmarks. Experimental results on simulated and real-world data show the great potential of our method for efficient active localization and perception-aware planning. To benefit related research, we release our implementation of the information field to the public.
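To illustrate the core idea described in the abstract, the sketch below pre-summarizes landmark-derived localization information on a voxel grid so that a query at planning time touches a single cell instead of iterating over all landmarks. This is a minimal, hypothetical sketch under simplifying assumptions (a range-only visibility check, a positional bearing-information approximation, and made-up names such as InfoField and query); it is not the paper's actual rotation-aware factorization or released implementation.

```python
import numpy as np

class InfoField:
    """Toy information field: per-voxel 3x3 positional information summaries."""

    def __init__(self, landmarks, grid_min, grid_max, voxel_size, max_range=10.0):
        self.landmarks = np.asarray(landmarks, dtype=float)   # (N, 3) landmark positions
        self.grid_min = np.asarray(grid_min, dtype=float)
        self.voxel_size = float(voxel_size)
        shape = np.ceil((np.asarray(grid_max) - self.grid_min) / voxel_size).astype(int)
        # One 3x3 information matrix per voxel; the real method also handles the
        # camera orientation at query time, which is omitted here for brevity.
        self.field = np.zeros((*shape, 3, 3))
        self._build(max_range)

    def _build(self, max_range):
        # Pre-computation: iterate over voxels x landmarks once, offline.
        for idx in np.ndindex(self.field.shape[:3]):
            center = self.grid_min + (np.asarray(idx) + 0.5) * self.voxel_size
            info = np.zeros((3, 3))
            for p in self.landmarks:
                d = p - center
                r = np.linalg.norm(d)
                if r < 1e-6 or r > max_range:
                    continue                      # crude visibility model: range check only
                bearing = d / r
                # A bearing observation constrains the camera position in the
                # directions orthogonal to the bearing, with weight ~ 1/r^2.
                info += (np.eye(3) - np.outer(bearing, bearing)) / r**2
            self.field[idx] = info

    def query(self, position):
        # Constant-time query: a single voxel lookup, no loop over landmarks.
        idx = np.floor((np.asarray(position) - self.grid_min) / self.voxel_size).astype(int)
        idx = np.clip(idx, 0, np.array(self.field.shape[:3]) - 1)
        info = self.field[tuple(idx)]
        # Use e.g. the smallest eigenvalue as a scalar localization-quality metric.
        return np.linalg.eigvalsh(info)[0]
```

A planner could call query(position) for every candidate waypoint and prefer trajectories with higher values; the per-query cost is independent of the number of landmarks, which is the point of the field representation.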


Presented at:
IEEE International Conference on Robotics and Automation (ICRA), Montreal, Canada, May 20-24, 2019
Year:
2019

