Information Sparsification in Visual-Inertial Odometry
Abstract
In this paper, we present a novel approach to tightly couple visual and inertial measurements in a fixed-lag visual-inertial odometry (VIO) framework using information sparsification. To bound computational complexity, fixed-lag smoothers typically marginalize out variables, but consequently introduce a densely connected linear prior that significantly deteriorates accuracy and efficiency. Current state-of-the-art approaches account for the issue by selectively discarding measurements and marginalizing additional variables. However, such strategies are sub-optimal from an information-theoretic perspective. Instead, our approach performs a dense marginalization step and preserves the information content of the dense prior. Our method sparsifies the dense prior with a nonlinear factor graph by minimizing the information loss. The resulting factor graph maintains information sparsity, structural similarity, and nonlinearity. To validate our approach, we conduct real-time drone tests and compare against current state-of-the-art fixed-lag VIO methods on the EuRoC visual-inertial dataset. The experimental results show that the proposed method achieves competitive or superior accuracy in almost all trials. We include a detailed run-time analysis to demonstrate that the proposed algorithm is suitable for real-time applications.
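To make the abstract's two key steps concrete, the sketch below illustrates (in a heavily simplified form, not the authors' implementation) how marginalizing variables out of a Gaussian prior in information form yields a dense prior via the Schur complement, and how that dense prior can be approximated by a sparse chain-structured Gaussian built from exact pairwise marginals, with the information loss measured by the KL divergence. All function names, the scalar-state toy problem, and the fixed chain topology are assumptions for illustration only; the paper's actual method recovers a nonlinear factor graph, not a linear Gaussian.

```python
# Illustrative sketch only: scalar states, fixed chain topology, linear
# Gaussian prior. Not the authors' nonlinear factor-recovery algorithm.
import numpy as np

def schur_marginalize(Lmbda, keep, drop):
    """Marginalize 'drop' indices out of an information matrix (Schur complement).
    The result is generally dense over the kept variables."""
    Laa = Lmbda[np.ix_(keep, keep)]
    Lab = Lmbda[np.ix_(keep, drop)]
    Lbb = Lmbda[np.ix_(drop, drop)]
    return Laa - Lab @ np.linalg.solve(Lbb, Lab.T)

def chain_sparsify(Sigma, order):
    """KL-optimal approximation of N(0, Sigma) restricted to a chain topology:
    combine exact pairwise marginals along the chain (Chow-Liu-style)."""
    d = Sigma.shape[0]
    Lq = np.zeros((d, d))
    for a, b in zip(order[:-1], order[1:]):
        pair = Sigma[np.ix_([a, b], [a, b])]      # exact 2x2 pairwise marginal
        Lq[np.ix_([a, b], [a, b])] += np.linalg.inv(pair)
    for v in order[1:-1]:                         # interior nodes are counted twice
        Lq[v, v] -= 1.0 / Sigma[v, v]
    return Lq

def kl_gauss(Sigma_p, Lmbda_q):
    """KL( N(0, Sigma_p) || N(0, Lmbda_q^{-1}) ), assuming equal means."""
    d = Sigma_p.shape[0]
    _, logdet_p = np.linalg.slogdet(Sigma_p)
    _, logdet_q = np.linalg.slogdet(Lmbda_q)
    return 0.5 * (np.trace(Lmbda_q @ Sigma_p) - d - logdet_q - logdet_p)

# Toy example: 6 scalar states; marginalize the last two, then sparsify
# the resulting dense prior over the remaining four to a chain.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))
L_full = A @ A.T + 6 * np.eye(6)                  # full SPD information matrix
L_dense = schur_marginalize(L_full, keep=[0, 1, 2, 3], drop=[4, 5])
Sigma = np.linalg.inv(L_dense)                    # dense marginal covariance
L_sparse = chain_sparsify(Sigma, order=[0, 1, 2, 3])
print("KL information loss of chain approximation:", kl_gauss(Sigma, L_sparse))
```

Under these assumptions, the printed KL value quantifies how much information the sparse prior sacrifices relative to the dense one; the paper's contribution is to keep this loss small while also preserving the nonlinearity and structure of the original factors.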
BibTeX
@conference{Hsiung-2018-107713,
author = {Jerry Hsiung and Ming Hsiao and Eric Westman and Rafael Valencia and Michael Kaess},
title = {Information Sparsification in Visual-Inertial Odometry},
booktitle = {Proceedings of (IROS) IEEE/RSJ International Conference on Intelligent Robots and Systems},
year = {2018},
month = {October},
pages = {1146--1153},
}