Abstract:
Robust odometry is at the core of robotics and autonomous systems performing navigation, exploration, and locomotion in complex environments across a broad spectrum of applications. While great progress has been made, the robustness of odometry systems remains a grand challenge. This talk introduces Super Odometry, an approach that leverages selective fusion to overcome ambiguous measurements in degraded environments such as low-light conditions, long corridors, and snow, and provides a highly robust SLAM solution.
Super Odometry, a high-precision multi-modal sensor fusion framework, provides a simple but effective way to fuse multiple sensors such as LiDAR, cameras, and IMUs, and achieves robust state estimation in perceptually degraded environments. Unlike previous methods, Super Odometry employs a selective fusion processing pipeline, which combines the advantages of loosely coupled and tightly coupled methods and recovers motion in a coarse-to-fine manner. We also perform an observability analysis and, based on it, propose hierarchical selective fusion strategies to seamlessly integrate information from features, parameters, states, and engines. Super Odometry has been thoroughly tested under aggressive motion, accumulating over 200 km and 500 hours of trajectories on a fleet of heterogeneous platforms including aerial, wheeled, and legged robots. Feel free to visit our project website for more details: https://superodometry.com/
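To give a flavor of the coarse-to-fine, selective-fusion idea described above, here is a minimal toy sketch (this is an illustrative assumption, not the actual Super Odometry implementation; all names such as imu_predict, is_healthy, and selective_fuse are hypothetical):

# Hypothetical sketch of selective, coarse-to-fine fusion.
# Not the actual Super Odometry API; all names are illustrative.
from dataclasses import dataclass

@dataclass
class State:
    position: float  # 1-D toy state for illustration
    velocity: float

def imu_predict(state: State, accel: float, dt: float) -> State:
    # Coarse stage: IMU propagation provides a motion prior that
    # stays available even when exteroceptive sensors degrade.
    return State(
        position=state.position + state.velocity * dt + 0.5 * accel * dt * dt,
        velocity=state.velocity + accel * dt,
    )

def is_healthy(residual: float, threshold: float = 0.5) -> bool:
    # Stand-in for an observability/degeneracy check (e.g. in a long
    # corridor, LiDAR residuals along the corridor axis carry little
    # information and that engine should be rejected or down-weighted).
    return abs(residual) < threshold

def selective_fuse(prior: State, engines: list[tuple[float, float]]) -> State:
    # Fine stage: refine the IMU prior only with engines whose
    # measurements pass the health check; otherwise fall back to the
    # prior, mimicking loosely coupled behavior in degraded conditions.
    position, weight = prior.position, 1.0
    for est_position, residual in engines:
        if is_healthy(residual):
            position += est_position
            weight += 1.0
    return State(position=position / weight, velocity=prior.velocity)

if __name__ == "__main__":
    state = State(position=0.0, velocity=1.0)
    prior = imu_predict(state, accel=0.2, dt=0.1)
    # Two hypothetical engines: LiDAR (healthy) and visual (degraded).
    fused = selective_fuse(prior, [(0.11, 0.1), (0.50, 2.0)])
    print(f"fused position: {fused.position:.4f}")

In this sketch the IMU supplies the coarse prior, and each exteroceptive engine contributes to the fine refinement only when its health check passes, which is the essence of selective fusion.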
Committee:
Sebastian Scherer, Chair
Michael Kaess
Shubham Tulsiani
Dan McGann