Pipe Mapping with Monocular Fisheye Imagery
Abstract
We present a vision-based mapping and localization system for operations in pipes such as those found in Liquefied Natural Gas (LNG) production. A forward-facing fisheye camera mounted on a prototype robot collects imagery as it is tele-operated through a pipe network. The images are processed offline to estimate camera pose and sparse scene structure; the results can be used to generate 3D renderings of the pipe surface. The method extends state-of-the-art visual odometry and mapping for fisheye systems by incorporating geometric constraints, based on prior knowledge of the pipe components, into a Sparse Bundle Adjustment framework. These constraints significantly reduce inaccuracies resulting from the limited spatial resolution of the fisheye imagery, limited image texture, and visual aliasing. Preliminary results are presented for datasets collected in our fiberglass pipe network, which demonstrate the validity of the approach.
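The geometric constraint described in the abstract can be understood as penalizing the deviation of reconstructed surface points from a pipe model of known dimensions. Below is a minimal, hypothetical Python sketch of such a cylindrical-surface residual, fit with SciPy's least_squares in place of a full Sparse Bundle Adjustment; the axis parameterization, radius value, and function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import least_squares

# Illustrative sketch: a cylindrical-surface prior expressed as an extra
# least-squares residual. The pipe axis is parameterized by a point on the
# axis and a direction vector; the radius is assumed known from the pipe
# specification (prior knowledge of the pipe components).

def cylinder_residuals(points, axis_point, axis_dir, radius):
    """Distance of each 3D point from a cylinder of known radius."""
    d = axis_dir / np.linalg.norm(axis_dir)
    v = points - axis_point                  # vectors from axis point to points
    v_perp = v - np.outer(v @ d, d)          # components perpendicular to the axis
    return np.linalg.norm(v_perp, axis=1) - radius

def objective(params, points, radius):
    axis_point, axis_dir = params[:3], params[3:]
    return cylinder_residuals(points, axis_point, axis_dir, radius)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    radius = 0.2                             # assumed known pipe radius (metres)
    theta = rng.uniform(0, 2 * np.pi, 200)
    z = rng.uniform(0, 2.0, 200)
    noise = 0.005 * rng.standard_normal((200, 3))
    points = np.column_stack([radius * np.cos(theta),
                              radius * np.sin(theta), z]) + noise

    x0 = np.array([0.05, -0.05, 0.0, 0.1, 0.0, 1.0])   # rough initial axis guess
    sol = least_squares(objective, x0, args=(points, radius))
    print("estimated axis point:", sol.x[:3])
    print("estimated axis direction:", sol.x[3:] / np.linalg.norm(sol.x[3:]))
```

In a full bundle adjustment these residuals would be weighted and stacked alongside the reprojection errors of the fisheye camera model, so that the sparse structure is pulled toward the known pipe geometry while camera poses are refined jointly.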
BibTeX
@conference{Hansen-2013-7791,
author = {Peter Hansen and Hatem Said Alismail and Peter Rander and Brett Browning},
title = {Pipe Mapping with Monocular Fisheye Imagery},
booktitle = {Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
year = {2013},
month = {November},
pages = {5180--5185},
keywords = {pipe mapping, visual SLAM, SLAM, fisheye, pipe SLAM},
}