Inertial and 3D-odometry fusion in rough terrain: Towards real 3D navigation
Many algorithms related to localization need good pose prediction in order to produce accurate results. This is especially true for data association algorithms, where false feature matches can lead to failure of the localization system. In rough terrain, the field of view can vary significantly between two feature extraction steps, so a good position prediction is necessary to track features robustly. This paper presents a method for combining dead-reckoning sensor information in order to provide an initial estimate of the six degrees of freedom of a rough-terrain rover. An inertial navigation system (INS) and the wheel encoders are used as sensory inputs. The sensor fusion scheme is based on an extended information filter (EIF) and is extensible to any kind and number of sensors. In order to test the system, the rover has been driven over different kinds of obstacles while computing both pure 3D-odometry and fused INS/3D-odometry trajectories. The results show that the use of the INS significantly improves the pose prediction.
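The abstract does not give the filter equations, but the key property it relies on, extensibility to any number of sensors, comes from the information (inverse-covariance) form, in which independent estimates combine by simple addition. The sketch below illustrates this for two hypothetical 6-DoF pose estimates (all numeric values are invented for illustration; this is not the paper's actual EIF implementation, which also includes a nonlinear prediction step).

```python
import numpy as np

def info_fuse(estimates):
    """Fuse independent Gaussian estimates in information form.

    Each estimate is a (mean, covariance) pair. In information form the
    contributions of the sensors simply add, which is why information
    filters extend naturally to any kind and number of sensors.
    """
    Y = np.zeros_like(estimates[0][1])   # fused information matrix
    y = np.zeros_like(estimates[0][0])   # fused information vector
    for mean, cov in estimates:
        Y_i = np.linalg.inv(cov)         # information matrix of this sensor
        Y += Y_i
        y += Y_i @ mean                  # information vector contribution
    P = np.linalg.inv(Y)                 # fused covariance
    return P @ y, P                      # fused mean and covariance

# Hypothetical 6-DoF pose estimates (x, y, z, roll, pitch, yaw):
odo_pose = np.array([1.00, 0.10, 0.02, 0.01, 0.05, 0.00])
odo_cov  = np.diag([0.04, 0.04, 0.09, 0.10, 0.10, 0.10])  # odometry: weak on attitude
ins_pose = np.array([1.02, 0.08, 0.00, 0.00, 0.04, 0.01])
ins_cov  = np.diag([0.25, 0.25, 0.25, 0.01, 0.01, 0.01])  # INS: strong on attitude

fused_pose, fused_cov = info_fuse([(odo_pose, odo_cov), (ins_pose, ins_cov)])
```

Because the variances add as inverses, the fused estimate is always at least as certain as the best individual sensor in each component, so the INS tightens the attitude estimate while odometry anchors the translation.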