Using Locally Corresponding CAD Models for Dense 3D Reconstructions from a Single Image

C. Kong, C. Lin, and S. Lucey
Conference Paper, Proceedings of (CVPR) Computer Vision and Pattern Recognition, pp. 5603–5611, July 2017

Abstract

We investigate the problem of estimating the dense 3D shape of an object, given a set of 2D landmarks and a silhouette in a single image. An obvious prior to employ in such a problem is a dictionary of dense CAD models. Employing a sufficiently large dictionary of CAD models, however, is in general computationally infeasible. A common strategy in dictionary learning to encourage generalization is to allow for linear combinations of dictionary elements. This too, however, is problematic, as most CAD models cannot readily be placed in global dense correspondence. In this paper, we propose a two-step strategy. First, we employ orthogonal matching pursuit to rapidly choose the single CAD model in our dictionary closest to the projected image. Second, we employ a novel graph embedding based on local dense correspondence to allow for sparse linear combinations of CAD models. We validate our framework experimentally in both synthetic and real-world scenarios and demonstrate the superiority of our approach for both 3D mesh reconstruction and volumetric representation.
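To make the first step of the two-step strategy concrete, below is a minimal sketch of orthogonal matching pursuit over a dictionary of atoms. It is not the paper's implementation: the function name `omp_select` and the toy data are illustrative assumptions, with each dictionary column standing in for a (vectorized, projected) CAD model and the target vector standing in for the observed image evidence. With a single non-zero coefficient, OMP reduces to picking the dictionary element most correlated with the target, which is the "closest single CAD model" selection the abstract describes.

```python
import numpy as np

def omp_select(D, y, n_nonzero=1):
    """Greedy orthogonal matching pursuit over dictionary D (atoms as columns).

    Returns the indices of the selected atoms and the least-squares
    coefficients that best reconstruct the target vector y on that support.
    With n_nonzero=1 this simply picks the single most correlated atom.
    """
    residual = y.copy()
    support = []
    coef = np.zeros(D.shape[1])
    for _ in range(n_nonzero):
        # Greedy step: atom most correlated with the current residual.
        correlations = D.T @ residual
        idx = int(np.argmax(np.abs(correlations)))
        if idx not in support:
            support.append(idx)
        # Re-fit coefficients on the current support by least squares.
        sol, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        coef[:] = 0.0
        coef[support] = sol
        residual = y - D[:, support] @ sol
    return support, coef

# Toy usage (synthetic data, purely illustrative): y mimics a noisy copy of
# one dictionary atom; OMP with sparsity 1 should recover that atom's index.
rng = np.random.default_rng(0)
D = rng.standard_normal((20, 5))
D /= np.linalg.norm(D, axis=0)          # unit-norm atoms, as OMP assumes
y = D[:, 2] + 0.05 * rng.standard_normal(20)
support, coef = omp_select(D, y, n_nonzero=1)
print(support, np.round(coef, 3))       # expected support: [2]
```

The second step, a graph embedding built from local dense correspondences that permits sparse linear combinations of CAD models, is specific to the paper's construction and is not sketched here.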

BibTeX

@conference{Kong-2017-121037,
author = {C. Kong and C. Lin and S. Lucey},
title = {Using Locally Corresponding CAD Models for Dense 3D Reconstructions from a Single Image},
booktitle = {Proceedings of (CVPR) Computer Vision and Pattern Recognition},
year = {2017},
month = {July},
pages = {5603--5611},
}