Vision-Based Navigation in Autonomous Close Proximity Operations Using Neural Networks
Close-proximity autonomous unmanned aerial vehicle (UAV) missions, such as formation flight (FF) and aerial refueling (AR), require an active controller working in conjunction with a precise sensor that can identify the leading aircraft and estimate its relative position and orientation. Among the possible choices, vision sensors are of particular interest because they are passive in nature and do not require any cooperation from the leading aircraft. In this paper, new vision-based estimation and navigation algorithms based on neural networks are developed. The accuracy and robustness of the proposed algorithms have been validated through detailed modeling and a complete virtual environment built on a six-degrees-of-freedom (6-DOF) nonlinear simulation of aircraft dynamics in an autonomous aerial refueling (AAR) mission. In addition, a full-state, time-variant tracking controller based on the pole placement method is designed to generate the commands required for the aircraft control surfaces and engine during AAR. The performance of the system in the presence of noise has also been examined.
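As an illustration of the pole placement step mentioned above, the following is a minimal state-feedback sketch using a toy two-state linear model; the matrices and pole locations are illustrative assumptions, not the aircraft model or controller from this paper:

```python
import numpy as np
from scipy.signal import place_poles

# Toy 2-state linear model x_dot = A x + B u (illustrative values only,
# not the paper's 6-DOF aircraft dynamics)
A = np.array([[0.0, 1.0],
              [-2.0, -0.5]])
B = np.array([[0.0],
              [1.0]])

# Desired closed-loop poles for the tracking loop (assumed for the sketch)
desired = np.array([-3.0, -4.0])

# Compute the state-feedback gain K so that eig(A - B K) = desired
K = place_poles(A, B, desired).gain_matrix
A_cl = A - B @ K

print(np.sort(np.linalg.eigvals(A_cl).real))  # closed-loop poles
```

A time-variant controller, as in the paper, would recompute (or schedule) such gains as the linearized dynamics change along the refueling trajectory.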