
PhD Thesis Defense

Tuesday, February 7
9:30 am
GHC 4405
Online Lidar and Vision based Ego-motion Estimation and Mapping

Ji Zhang
Carnegie Mellon University

Abstract

In many real-world applications, ego-motion estimation and mapping must be conducted online. In the robotics world, especially, real-time motion estimates are important for control of autonomous vehicles, while online generated maps are crucial for obstacle avoidance and path planning. Further, the complete map of a traversed environment can be taken as an input for further processing such as scene segmentation, 3D reasoning, and virtual reality.

To date, fusing a large amount of data from a variety of sensors in real time remains a nontrivial problem. The problem is particularly hard if it is to be solved in 3D, accurately, robustly, and in a small form factor. This thesis proposes to tackle the problem by leveraging range, vision, and inertial sensing in a coarse-to-fine manner, through multi-layer processing. In a modularized processing pipeline, modules requiring only light computation execute at high frequencies to gain robustness with respect to high-rate, rapid motion, while modules requiring heavy computation run at low frequencies to ensure the accuracy of the resulting motion estimates and maps.
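
As a rough sketch of how such a multi-rate, modularized pipeline can be organized (the module names, rates, and interfaces below are illustrative assumptions in Python, not the thesis software), a light odometry module can run every cycle while a heavy mapping module refines the result at a lower rate:

import numpy as np

class FastOdometry:
    # Light module: runs every cycle so the estimate keeps up with rapid motion.
    def update(self, imu_sample, pose):
        # Placeholder for coarse motion integration (e.g., inertial prediction
        # or scan-to-scan matching); here the pose is returned unchanged.
        return pose

class SlowMapping:
    # Heavy module: runs occasionally to refine the pose and grow the map.
    def refine(self, scan, pose, world_map):
        # Placeholder for fine scan-to-map registration and map update.
        world_map.append(scan)
        return pose, world_map

def run(sensor_stream, fast_hz=200, map_hz=5):
    odom, mapper = FastOdometry(), SlowMapping()
    pose, world_map = np.eye(4), []
    for i, (imu_sample, scan) in enumerate(sensor_stream):
        pose = odom.update(imu_sample, pose)                 # coarse, high frequency
        if scan is not None and i % (fast_hz // map_hz) == 0:
            pose, world_map = mapper.refine(scan, pose, world_map)  # fine, low frequency
    return pose, world_map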

Further, the modularized processing pipeline is capable of handling sensor degradation by automatically reconfiguring itself to bypass failed modules. Vision-based methods typically fail in low-light or texture-less scenes. Likewise, lidar-based methods are problematic in symmetric or extruded environments such as a long, straight corridor. When such degradation occurs, the proposed pipeline automatically determines the degraded subspace of the problem state space and solves the problem partially, in the well-conditioned subspace. Consequently, the final solution is formed by combining the “healthy” parts from each module.
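
One common way to realize such a degeneracy check, shown here only as a minimal sketch under the assumption of a Gauss-Newton style solver (the eigenvalue threshold and function name are illustrative, not the thesis code), is to eigen-decompose the normal-equation matrix, treat directions with small eigenvalues as degraded, and restrict the update to the remaining well-conditioned subspace:

import numpy as np

def well_conditioned_update(H, g, eig_threshold=100.0):
    # H = J^T J and g = J^T r from a Gauss-Newton step; the eigenvalue threshold
    # separating "healthy" from degraded directions is an illustrative assumption.
    eigvals, eigvecs = np.linalg.eigh(H)
    healthy = eigvals > eig_threshold             # well-constrained directions
    V = eigvecs[:, healthy]                       # basis of the healthy subspace
    dx_sub = np.linalg.solve(V.T @ H @ V, -(V.T @ g))
    return V @ dx_sub                             # no update along degraded directions

When no direction falls below the threshold, this reduces to the ordinary Gauss-Newton update; when some do, the state is simply left unchanged along those directions.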

The proposed ego-motion estimation and mapping methods have been validated in extensive experiments with car-mounted, hand-carried, and drone-attached sensor setups. Experiments are conducted in various environments covering structured urban areas as well as unstructured natural scenes. The results indicate that the methods deliver high-precision estimation over long distances of travel while remaining robust to high-speed, aggressive motion and environmental degradation.

Thesis Committee

Sanjiv Singh, Chair
Martial Hebert
Michael Kaess
Larry Matthies, Jet Propulsion Laboratory