UWStereoNet: Unsupervised Learning for Depth Estimation and Color Correction of Underwater Stereo Imagery
Abstract
Stereo cameras are widely used for sensing and navigation of underwater robotic systems. They can provide high-resolution color views of a scene; the constrained camera geometry enables metrically accurate depth estimation; and they are relatively cost-effective. Traditional stereo vision algorithms rely on feature detection and matching to triangulate points for estimating disparity. For underwater applications, however, the effects of underwater light propagation degrade images, reducing quality and contrast. This makes it challenging to detect and match features, especially across varying viewpoints. Recently, deep learning has shown success in end-to-end learning of dense disparity maps from stereo images. Still, many state-of-the-art methods are supervised and require ground truth depth or disparity, which is challenging to gather in subsea environments. Deep learning has also been applied to the problem of underwater image restoration, where real ground truth data is likewise difficult or impossible to gather. In this work, we present an unsupervised deep neural network (DNN) that takes as input raw color underwater stereo imagery and outputs dense depth maps and color-corrected imagery of underwater scenes. We leverage a model of the process of underwater image formation, image processing techniques, and the geometric constraints inherent to the stereo vision problem to develop a modular network that outperforms existing methods.
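The abstract refers to two building blocks that are standard in this area: a physical model of underwater image formation and the geometric disparity-to-depth relation of calibrated stereo. As an illustrative sketch only (not the paper's actual network or loss), the attenuation-plus-backscatter model commonly used in the underwater imaging literature and the pinhole stereo relation can be written as below; all function names and parameter values are hypothetical.

```python
import numpy as np

# Simplified underwater image formation model, per color channel c:
#   I_c = J_c * exp(-beta_c * d) + B_c * (1 - exp(-beta_c * d))
# I_c: observed intensity, J_c: true (restored) intensity,
# beta_c: wavelength-dependent attenuation coefficient,
# B_c: background (veiling) light, d: scene range in meters.

def degrade(J, d, beta, B):
    """Apply attenuation + backscatter to a clean image J (H x W x 3)."""
    t = np.exp(-beta * d[..., None])      # per-pixel, per-channel transmission
    return J * t + B * (1.0 - t)

def restore(I, d, beta, B):
    """Invert the model given range d and water parameters beta, B."""
    t = np.exp(-beta * d[..., None])
    return (I - B * (1.0 - t)) / t

def disparity_to_depth(disparity, focal_px, baseline_m):
    """Standard rectified-stereo relation: depth = f * b / disparity."""
    return focal_px * baseline_m / np.maximum(disparity, 1e-6)
```

Because red light attenuates fastest in water, `beta` is typically largest for the red channel; this wavelength dependence is what color correction must undo, and it is why range (from stereo) and color restoration are naturally coupled problems.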
BibTeX
@conference{Skinner-2019-130147,
author = {Katherine A. Skinner and Junming Zhang and Elizabeth A. Olson and Matthew Johnson-Roberson},
title = {UWStereoNet: Unsupervised Learning for Depth Estimation and Color Correction of Underwater Stereo Imagery},
booktitle = {Proceedings of (ICRA) International Conference on Robotics and Automation},
year = {2019},
month = {May},
pages = {7947--7954},
}