Conference paper

Correction Of Airborne Pushbroom Images Orientation Using Bundle Adjustment Of Frame Images

To compute hyperspectral orthophotos of an area, one may proceed as for standard RGB orthophotos: equip an aircraft or a drone with the appropriate camera, a GPS and an Inertial Measurement Unit (IMU). The position and attitude data from the navigation sensors, together with the collected images, can be input to a bundle adjustment, which refines the estimates of the parameters and makes it possible to create 3D models or orthophotos of the scene. But most hyperspectral cameras are pushbroom sensors: they acquire lines of pixels. The bundle adjustment identifies tie points (using their 2D neighbourhoods) between different images to stitch them together, which is impossible when the input images are single lines. To get around this problem, we propose a method that can be used when both a frame RGB camera and a hyperspectral pushbroom camera are flown during the same flight. We first use bundle adjustment theory to obtain corrected navigation parameters for the RGB camera. Then, assuming a small boresight between the RGB camera and the navigation sensors, we estimate this boresight as well as the corrected position and attitude parameters for the navigation sensors. Finally, supposing that the boresight between these sensors and the pushbroom camera is constant during the flight, we retrieve it by manually matching corresponding pairs of points between the current projection and a reference. A comparison between direct georeferencing and georeferencing with our method on three flights performed during the Leman-Baikal project shows a great improvement in ground accuracy.
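The constant-boresight assumption can be illustrated with a minimal sketch (not the paper's actual algorithm; all function names and the toy data are illustrative). If the bundle adjustment yields camera orientations R_cam[i] and the navigation sensors yield attitudes R_imu[i], a single boresight rotation R_b with R_cam[i] ≈ R_b · R_imu[i] can be recovered by averaging the per-pose relative rotations and projecting the mean back onto the rotation group via SVD:

```python
import numpy as np

def rot_z(theta):
    """Rotation about the z-axis (used only to build toy data)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def estimate_boresight(R_imu, R_cam):
    """Estimate a constant boresight R_b such that R_cam[i] ~= R_b @ R_imu[i].

    Averages the per-pose relative rotations R_cam[i] @ R_imu[i].T and
    projects the mean onto SO(3) with an SVD (orthogonal Procrustes step).
    """
    M = np.zeros((3, 3))
    for Ri, Rc in zip(R_imu, R_cam):
        M += Rc @ Ri.T                 # per-pose boresight estimate
    U, _, Vt = np.linalg.svd(M)
    R_b = U @ Vt
    if np.linalg.det(R_b) < 0:         # enforce a proper rotation (det = +1)
        U[:, -1] *= -1
        R_b = U @ Vt
    return R_b

# Toy check: a known 2-degree boresight about z, noiseless attitudes
true_b = rot_z(np.deg2rad(2.0))
R_imu = [rot_z(t) for t in np.linspace(0.0, 1.0, 10)]
R_cam = [true_b @ R for R in R_imu]
est = estimate_boresight(R_imu, R_cam)
print(np.allclose(est, true_b))  # True
```

With noisy real-flight attitudes the SVD projection gives a least-squares average of the per-pose estimates rather than an exact recovery, which is why the paper's pipeline refines the parameters within the bundle adjustment instead of relying on such a one-shot average.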
