Building 3D Mosaics from an Autonomous Underwater Vehicle, Doppler Velocity Log, and 2D Imaging Sonar
Abstract
This paper reports on a 3D photomosaicing pipeline using data collected from an autonomous underwater vehicle performing simultaneous localization and mapping (SLAM). The pipeline projects and blends 2D imaging sonar data onto a large-scale 3D mesh that is either given a priori or derived from SLAM. Compared to other methods that generate a 2D-only mosaic, our approach produces 3D models that are more structurally representative of the environment being surveyed. Additionally, our system leverages recent work in underwater SLAM using sparse point clouds derived from Doppler velocity log range returns to relax the need for a prior model. We show that the method produces reasonably accurate surface reconstruction and blending consistency, with and without the use of a prior mesh. We experimentally evaluate our approach with a Hovering Autonomous Underwater Vehicle (HAUV) performing inspection of a large underwater ship hull.
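
As a rough illustration of the projection-and-blending step described in the abstract (a minimal sketch, not the authors' implementation), the following Python snippet projects planar sonar returns into the world frame using SLAM-derived poses and blends their intensities onto nearby mesh vertices with an inverse-distance weighted average. The function names, the zero-elevation assumption on each return, the association radius, and the weighting scheme are assumptions made for illustration only.

import numpy as np

def project_sonar_sample(range_m, bearing_rad, T_world_sonar):
    """Convert a single sonar (range, bearing) return, assumed to lie in the
    sonar's zero-elevation plane, into a 3D point in the world frame."""
    p_sonar = np.array([range_m * np.cos(bearing_rad),
                        range_m * np.sin(bearing_rad),
                        0.0, 1.0])
    return (T_world_sonar @ p_sonar)[:3]

def blend_onto_mesh(vertices, samples, poses, intensities, radius=0.1):
    """Accumulate sonar intensities onto nearby mesh vertices.

    vertices    : (V, 3) mesh vertex positions (a priori or SLAM-derived)
    samples     : list of (range_m, bearing_rad) sonar returns
    poses       : list of 4x4 world-from-sonar transforms (e.g., from SLAM)
    intensities : list of scalar intensities, one per return
    radius      : association radius in meters (an assumed parameter)
    """
    accum = np.zeros(len(vertices))
    weight = np.zeros(len(vertices))
    for (rng, brg), T, intensity in zip(samples, poses, intensities):
        p = project_sonar_sample(rng, brg, T)
        d = np.linalg.norm(vertices - p, axis=1)
        near = d < radius
        w = 1.0 / (d[near] + 1e-6)   # inverse-distance weighting (assumed)
        accum[near] += w * intensity
        weight[near] += w
    # Per-vertex blended intensity; vertices never hit by a return stay zero.
    return np.divide(accum, weight, out=np.zeros_like(accum),
                     where=weight > 0)

In a real pipeline the association would be done against mesh faces with visibility checks and the blending would account for the sonar beam geometry; the per-vertex nearest-neighbor average above is only meant to convey the structure of the computation.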
BibTeX
@conference{Ozog-2015-5946,
  author = {Paul Ozog and Giancarlo Troni and Michael Kaess and Ryan M. Eustice and Matthew Johnson-Roberson},
  title = {Building 3D Mosaics from an Autonomous Underwater Vehicle, Doppler Velocity Log, and 2D Imaging Sonar},
  booktitle = {Proceedings of (ICRA) International Conference on Robotics and Automation},
  year = {2015},
  month = {May},
  pages = {1137--1143},
}