Texture-Illumination Separation for Single-Shot Structured Light Reconstruction
Abstract
Active illumination-based methods trade off acquisition time against the resolution of the estimated 3D shapes. Multi-shot approaches can generate dense reconstructions but require stationary scenes. Single-shot methods are applicable to dynamic objects but can only estimate sparse reconstructions and are sensitive to surface texture. We present a single-shot approach to produce dense shape reconstructions of highly textured objects illuminated by one or more projectors. The key to our approach is an image decomposition scheme that can recover the illumination image of different projectors and the texture images of the scene from their mixed appearances. We focus on three cases of mixed appearances: illumination from one projector onto a textured surface, illumination from multiple projectors onto a textureless surface, and their combined effect. Our method can accurately compute per-pixel warps from the illumination patterns and the texture template to the observed image. The texture template is obtained by interleaving the projection sequence with an all-white pattern. The estimated warps are reliable even with infrequent interleaved projection and strong object deformation. Thus, we obtain detailed shape reconstruction and dense motion tracking of the textured surfaces. The proposed method, implemented using a system of one camera and two projectors, is validated on synthetic and real data containing subtle non-rigid surface deformations.
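To make the notion of mixed appearances and per-pixel warps concrete, the sketch below synthesizes an observed image under a simple multiplicative texture-illumination assumption: each camera pixel sees the surface texture (sampled through its own warp) modulated by the sum of the projector patterns (each sampled through its own warp). This is an illustrative assumption, not the paper's implementation; the function and parameter names are hypothetical.

import numpy as np

def compose_observation(texture, illum_patterns, warp_texture, warp_illums):
    """Synthesize a mixed-appearance image I(x) ~ T(W_t(x)) * sum_k L_k(W_k(x)).

    texture        : HxW texture template (e.g., captured under the all-white pattern)
    illum_patterns : list of HxW projector patterns
    warp_texture   : function mapping pixel coords to integer texture-template coords
    warp_illums    : list of functions mapping pixel coords to integer pattern coords
    """
    H, W = texture.shape
    ys, xs = np.mgrid[0:H, 0:W]

    # Sample the texture template through its warp (nearest-neighbor for brevity).
    ty, tx = warp_texture(ys, xs)
    tex = texture[np.clip(ty, 0, H - 1), np.clip(tx, 0, W - 1)]

    # Accumulate each projector's illumination through its own warp.
    illum = np.zeros((H, W))
    for pattern, warp in zip(illum_patterns, warp_illums):
        ly, lx = warp(ys, xs)
        illum += pattern[np.clip(ly, 0, H - 1), np.clip(lx, 0, W - 1)]

    # Mixed appearance: texture modulated by the combined illumination.
    return tex * illum

Under this kind of model, the decomposition task described in the abstract amounts to recovering the texture image and the per-projector illumination images (equivalently, their warps) from the observed product.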
BibTeX
@article{Vo-2016-120195,
  author  = {M. P. Vo and S. G. Narasimhan and Y. Sheikh},
  title   = {Texture-Illumination Separation for Single-Shot Structured Light Reconstruction},
  journal = {IEEE Transactions on Pattern Analysis and Machine Intelligence},
  year    = {2016},
  month   = {February},
  volume  = {38},
  number  = {2},
  pages   = {390--404},
}