State Estimation and Vision-based Occupancy Mapping for Off-Road Driving
Abstract
Autonomous navigation in off-road driving scenarios requires accurate and reliable vehicle localization. At the same time, planning for vehicles traversing rough terrain and cluttered areas demands efficient and scalable mapping of the surrounding environment. This thesis explores methods for efficient localization and mapping for a self-driving all-terrain vehicle in off-road environments. We present a generalized extended Kalman filtering approach for estimating the vehicle state both globally and locally with minimal sensor fusion. We then investigate the capability of a state-of-the-art visual SLAM method using a single stereo camera in off-road driving cases. Finally, we propose a 3D occupancy mapping framework that uses stereo vision and integrates visual SLAM. We evaluate our approaches with real-time driving tests and with experiments on a publicly available dataset as well as our own off-road dataset collected at the test field.
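To illustrate the kind of extended Kalman filtering the abstract refers to, the sketch below shows a generic EKF predict/update cycle for a planar vehicle state with a position measurement. The state layout, motion model, and measurement model here are simplifying assumptions for illustration only and are not taken from the thesis itself.

```python
import numpy as np

# Minimal illustrative EKF for a planar vehicle state [x, y, yaw, v].
# The constant-velocity unicycle motion model and the position-only
# measurement (e.g., a GPS fix) are assumptions, not the thesis's models.

def ekf_predict(x, P, dt, Q):
    """Propagate the state and covariance through the motion model."""
    px, py, yaw, v = x
    x_pred = np.array([px + v * np.cos(yaw) * dt,
                       py + v * np.sin(yaw) * dt,
                       yaw,
                       v])
    # Jacobian of the motion model with respect to the state
    F = np.array([[1, 0, -v * np.sin(yaw) * dt, np.cos(yaw) * dt],
                  [0, 1,  v * np.cos(yaw) * dt, np.sin(yaw) * dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]])
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def ekf_update(x, P, z, R):
    """Correct the state with a 2D position measurement z = [x, y]."""
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]])           # linear measurement model
    y = z - H @ x                          # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new
```

In practice, the prediction step runs at the rate of the fastest sensor (e.g., wheel odometry or IMU), while updates are applied whenever a slower measurement such as a GPS fix arrives; fusing additional sensors amounts to adding further update steps with their own measurement models.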
BibTeX
@mastersthesis{Chou-2017-27175,
author = {Wei-Hsin Chou},
title = {State Estimation and Vision-based Occupancy Mapping for Off-Road Driving},
year = {2017},
month = {August},
school = {Carnegie Mellon University},
address = {Pittsburgh, PA},
number = {CMU-RI-TR-17-58},
keywords = {Off-Road, State Estimation, EKF, visual SLAM, Occupancy Mapping},
}