DROAN: Disparity space Representation for Obstacle AvoidaNce
Abstract
Agile Micro Aerial Vehicles (MAVs) are required to operate in cluttered, unstructured environments at high speeds and low altitudes for efficient data gathering. Given their payload constraints and long-range sensing requirements, cameras are the preferred sensing modality for MAVs.
However, contemporary approaches use stereo camera observations in 3D space by converting the disparity image to a point cloud; they fail to cope with the excess sensor noise at long ranges and often resort to using only the less noisy short-range observations. The computational burden of exploiting the rich information provided by cameras, together with the difficulty of accounting for sensor error in obstacle sensing, has forced state-of-the-art methods to construct world representations on a per-frame basis, leading to myopic decision making.
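The long-range noise problem above follows from stereo triangulation itself: depth is inversely proportional to disparity, so a fixed pixel-level disparity error produces a depth error that grows quadratically with range. A minimal sketch, using assumed (hypothetical) camera parameters rather than any rig from the thesis:

```python
# Sketch: depth error of stereo triangulation under first-order noise
# propagation. All parameters below are assumptions for illustration.
f_px = 400.0   # focal length in pixels (assumed)
baseline = 0.2 # stereo baseline in meters (assumed)
sigma_d = 0.5  # disparity noise in pixels (assumed)

def depth(d):
    # Stereo triangulation: Z = f * B / d
    return f_px * baseline / d

def depth_sigma(z):
    # First-order propagation of disparity noise into depth:
    # sigma_Z ~= (Z^2 / (f * B)) * sigma_d  -- quadratic in range.
    return (z * z / (f_px * baseline)) * sigma_d

for z in (2.0, 5.0, 10.0, 20.0):
    print(f"Z = {z:5.1f} m  ->  sigma_Z ~= {depth_sigma(z):.2f} m")
```

Doubling the range quadruples the depth uncertainty, which is why methods that reason in metric 3D space tend to discard long-range stereo observations, while the disparity error itself stays roughly constant in image space.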
In this thesis we propose a long-range perception and planning approach using stereo cameras. We propose an inverse-depth based obstacle representation that is adept at using the information provided by stereo cameras and allows a sensor noise model to be incorporated for probabilistic occupancy inference. By representing obstacles with 2D inverse-depth (disparity) images, our method enables computationally efficient, on-demand occupancy inference.
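To make the idea of on-demand occupancy inference in disparity space concrete, the following is a minimal sketch, not the thesis implementation: a query depth is mapped to its expected disparity, and a Gaussian likelihood is evaluated in disparity space, where stereo noise is approximately constant. All names, constants, and the likelihood shape here are illustrative assumptions.

```python
import math

# Assumed camera parameters and noise level (hypothetical, for illustration).
F_PX = 400.0     # focal length in pixels
BASELINE = 0.2   # stereo baseline in meters
SIGMA_D = 0.5    # disparity noise in pixels, roughly range-independent

def expected_disparity(z):
    # Disparity the (assumed) stereo rig would measure for a point at depth z.
    return F_PX * BASELINE / z

def occupancy_logodds(measured_d, query_z):
    # Hypothetical inverse sensor model: compare the measured disparity with
    # the disparity expected at the queried depth, using a Gaussian in
    # disparity space. Probability ranges from 0.2 (free) to 0.8 (occupied).
    d_q = expected_disparity(query_z)
    p_occ = 0.2 + 0.6 * math.exp(-0.5 * ((measured_d - d_q) / SIGMA_D) ** 2)
    return math.log(p_occ / (1.0 - p_occ))
```

Because the query is evaluated directly against the stored disparity image, occupancy can be computed lazily for exactly the cells a planner asks about, rather than eagerly converting every frame into a metric point cloud.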
By utilizing FPGA hardware for disparity calculation and image space to represent obstacles, our approach and system design allow for the construction of a long-term world representation while accounting for highly non-linear noise models in real time.
We demonstrate these obstacle avoidance capabilities on a quadrotor flying through dense foliage at speeds of up to 10 m/s over a total of 1.6 hours of autonomous flight. The presented approach enables high-speed, low-altitude navigation for MAVs performing terrestrial scouting.
BibTeX
@mastersthesis{Dubey-2017-27178,
author = {Geetesh Dubey},
title = {DROAN: Disparity space Representation for Obstacle AvoidaNce},
year = {2017},
month = {August},
school = {Carnegie Mellon University},
address = {Pittsburgh, PA},
number = {CMU-RI-TR-17-42},
keywords = {Collision Avoidance, Unmanned Aerial Systems, Micro Aerial Systems, Perception and Autonomy, Sensor Fusion, Obstacle Avoidance, Visual Navigation, Occupancy Mapping, Stereo Vision},
}