Orbital SLAM
Abstract
This paper demonstrates infrastructure-free orbital Simultaneous Localization and Mapping (SLAM). Individual surface landmarks are tracked through images taken in orbit, and the filter receives measurements of these landmarks in the form of bearing angles. The filter then updates the spacecraft's position and velocity as well as the landmark locations, thereby building a map of the orbited body. In contrast to approaches that resolve scale with an IMU, which is ineffective in orbit because a freely falling spacecraft senses no proper acceleration, the contribution of this paper is to demonstrate that scale can be resolved using orbital dynamics. Radio localization can thus be replaced with onboard localization, enabling truly autonomous missions to both under-mapped and unmapped planetary bodies. Overall system convergence is shown by simulating landmark detection along an orbit from the Clementine Mission over a Moon model constructed from Lunar Reconnaissance Orbiter (LRO) digital elevation data, in conjunction with the filter. The techniques developed in this work demonstrate that, when combined with a gravity model, visual SLAM converges to a fully scaled solution.
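The two ingredients named in the abstract, a scale-free bearing measurement and metric orbital dynamics, can be illustrated with a minimal sketch. This is not the paper's filter; it is a hypothetical fragment assuming simple point-mass two-body gravity (the constant `MU_MOON` and the function names are illustrative). The point it shows is that the dynamics involve an absolute gravitational parameter, so propagating the state fixes metric scale, while the bearing measurement alone is scale-free.

```python
import numpy as np

MU_MOON = 4.9028e12  # lunar gravitational parameter, m^3/s^2 (metric scale enters here)

def two_body_accel(r):
    """Point-mass gravity; depends on absolute distance, so it ties the state to metric units."""
    return -MU_MOON * r / np.linalg.norm(r) ** 3

def rk4_step(r, v, dt):
    """One RK4 step of the two-body equations of motion for state [r, v]."""
    def f(s):
        return np.concatenate([s[3:], two_body_accel(s[:3])])
    s = np.concatenate([r, v])
    k1 = f(s)
    k2 = f(s + 0.5 * dt * k1)
    k3 = f(s + 0.5 * dt * k2)
    k4 = f(s + dt * k3)
    s = s + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)
    return s[:3], s[3:]

def bearing(r_sc, p_landmark):
    """Unit line-of-sight vector from spacecraft to landmark: a scale-free
    measurement, since any uniform scaling of positions leaves it unchanged."""
    d = p_landmark - r_sc
    return d / np.linalg.norm(d)
```

For example, a spacecraft initialized on a circular orbit stays at a fixed metric radius under `rk4_step`, and `bearing` returns the same unit vector if all positions are scaled by a common factor, which is exactly why vision alone cannot fix scale but the dynamics can.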
BibTeX
@conference{Vassallo-2015-122453,
  author    = {Corinne Vassallo and Wennie Tabib and Kevin Peterson},
  title     = {Orbital SLAM},
  booktitle = {Proceedings of 12th Conference on Computer and Robot Vision (CRV '15)},
  year      = {2015},
  month     = {June},
  pages     = {305--312},
}