Autonomous Flight in GPS-Denied Environments Using Monocular Vision and Inertial Sensors
Abstract
A vision-aided inertial navigation system that enables autonomous flight of an aerial vehicle in GPS-denied environments is presented. In particular, feature point information from a monocular vision sensor is used to bound the drift that results from integrating acceleration and angular rate measurements from an Inertial Measurement Unit (IMU) forward in time. An Extended Kalman filter framework is proposed in which the tasks of vision-based mapping and navigation are performed separately. When GPS is available, multiple observations of a single landmark point from the vision sensor are used to estimate that point's location in inertial space. When GPS is not available, landmarks that have been sufficiently mapped can instead be used to estimate the vehicle's position and attitude. Simulation and flight test results of a vehicle operating autonomously in a simplified loss-of-GPS scenario verify the presented method.
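The sketch below is a toy illustration of the idea described in the abstract, not the authors' implementation: an EKF propagates position and velocity from IMU accelerations (so error drifts), and corrections come from observing a landmark whose inertial location was mapped earlier while GPS was available. Attitude, sensor biases, and the true bearings-only monocular measurement model are omitted; the time step, noise values, landmark location, and measurement used here are made-up assumptions.

import numpy as np

dt = 0.02                       # IMU sample period [s] (assumed)
x = np.zeros(6)                 # state: position (3), velocity (3)
P = np.eye(6) * 0.1             # state covariance
Q = np.eye(6) * 1e-3            # process noise (tuning assumption)
R = np.eye(3) * 0.5             # vision measurement noise (assumption)

F = np.eye(6)
F[0:3, 3:6] = np.eye(3) * dt    # constant-velocity kinematics

def predict(x, P, accel):
    """Integrate IMU acceleration forward; drift grows without correction."""
    x = F @ x + np.concatenate([0.5 * accel * dt**2, accel * dt])
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z, landmark):
    """Correct drift with a (simplified) relative-position observation of a
    previously mapped landmark: z = landmark - position + noise."""
    H = np.hstack([-np.eye(3), np.zeros((3, 3))])   # measurement Jacobian
    y = z - (landmark - x[0:3])                     # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(6) - K @ H) @ P
    return x, P

# Example: propagate on one IMU sample, then correct with a vision observation.
landmark = np.array([10.0, 5.0, -2.0])              # mapped while GPS was up (hypothetical)
x, P = predict(x, P, accel=np.array([0.1, 0.0, 0.0]))
x, P = update(x, P, z=landmark - x[0:3] + 0.05, landmark=landmark)

In the paper's actual framework the measurement is a projected feature point in the image, and mapping (landmark estimation with GPS) and navigation (vehicle pose estimation without GPS) are handled as separate estimation tasks; the linear relative-position measurement above is only a stand-in to keep the example short.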
BibTeX
@article{Wu-2013-7686,
  author  = {A. D. Wu and E. N. Johnson and M. Kaess and F. Dellaert and G. Chowdhary},
  title   = {Autonomous Flight in GPS-Denied Environments Using Monocular Vision and Inertial Sensors},
  journal = {Journal of Aerospace Information Systems},
  year    = {2013},
  month   = {April},
  volume  = {10},
  number  = {4},
  pages   = {172--186},
}