Lidar point-to-point correspondences for rigorous registration of kinematic scanning in dynamic networks
With the objective of improving the registration of lidar point clouds produced by kinematic scanning systems, we propose a novel trajectory adjustment procedure that leverages the automated extraction of selected reliable 3D point-to-point correspondences between overlapping point clouds and their joint integration (adjustment) together with raw inertial and GNSS observations. This is performed in a tightly coupled fashion using a dynamic network approach that yields an optimally compensated trajectory through modeling of errors at the sensor level rather than at the trajectory level. The 3D correspondences are formulated as static conditions within the dynamic network, and the registered point cloud is generated with significantly higher accuracy from the corrected trajectory and possibly other parameters determined within the adjustment. We first describe the method for selecting correspondences and how they are inserted into the dynamic network via a new observation model, and we provide an open-source implementation of the solver employed in this work. We then describe the experiments conducted to evaluate the performance of the proposed framework in practical airborne laser scanning scenarios with low-cost MEMS inertial sensors. In these experiments, the proposed method for establishing 3D correspondences is effective in determining point-to-point matches across a wide range of geometries such as trees, buildings and cars. Our results demonstrate that the method improves point cloud registration accuracy (~5x under nominal and ~10x under emulated GNSS outage conditions in the studied cases), which is otherwise strongly affected by errors in the determined platform attitude or position, and can additionally determine unknown boresight angles. The proposed method remains effective even when only a fraction (~0.1%) of the total number of established 3D correspondences is considered in the adjustment.
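As a rough illustration of the correspondence-selection step described above, the sketch below pairs points from two overlapping clouds by mutual nearest-neighbor search, keeps only matches within a distance gate, and subsamples a small fraction for use as static conditions in the adjustment. The mutual-nearest-neighbor criterion, the thresholds and the function name are illustrative assumptions, not the selection procedure of the paper.

```python
# Illustrative sketch (not the paper's selection procedure): pick a sparse,
# reliable subset of 3D point-to-point correspondences between two
# overlapping point clouds using mutual nearest neighbors and a distance gate.
import numpy as np
from scipy.spatial import cKDTree


def select_correspondences(cloud_a, cloud_b, max_dist=0.5, keep_fraction=0.001, seed=0):
    """Return index pairs (i, j) such that cloud_a[i] <-> cloud_b[j].

    cloud_a, cloud_b : (N, 3) and (M, 3) arrays of 3D points.
    max_dist         : assumed gating distance [m] for a match to be kept.
    keep_fraction    : fraction of accepted matches passed on to the
                       adjustment (the abstract reports ~0.1% suffices).
    """
    tree_a, tree_b = cKDTree(cloud_a), cKDTree(cloud_b)

    # Nearest neighbor of every point of A in B, and of every point of B in A.
    dist_ab, idx_ab = tree_b.query(cloud_a)   # A -> B
    _, idx_ba = tree_a.query(cloud_b)         # B -> A

    # Keep mutual nearest neighbors that also lie within the distance gate.
    i = np.arange(len(cloud_a))
    mutual = idx_ba[idx_ab] == i
    accepted = np.flatnonzero(mutual & (dist_ab <= max_dist))
    if accepted.size == 0:
        return np.empty((0, 2), dtype=int)

    # Subsample a small fraction; each surviving pair would then enter the
    # dynamic network as one static 3D point-to-point condition.
    rng = np.random.default_rng(seed)
    n_keep = max(1, int(keep_fraction * accepted.size))
    chosen = rng.choice(accepted, size=n_keep, replace=False)
    return np.column_stack((chosen, idx_ab[chosen]))
```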