Toward Image-Based Scene Representation Using View Morphing

Steven Seitz and C. R. Dyer
Conference Paper, Proceedings of the 13th International Conference on Pattern Recognition (ICPR '96), pp. 84-89, August, 1996

Abstract

The question of which views may be inferred from a set of basis images is addressed. Under certain conditions, a discrete set of images implicitly describes scene appearance for a continuous range of viewpoints. In particular, it is demonstrated that two basis views of a static scene determine the set of all views on the line between their optical centers. Additional basis views further extend the range of predictable views to a two- or three-dimensional region of viewspace. These results are shown to apply under perspective projection subject to a generic visibility constraint called monotonicity. In addition, a simple scanline algorithm is presented for actually generating these views from a set of basis images. The technique, called view morphing, may be applied to both calibrated and uncalibrated images. At a minimum, two basis views and their fundamental matrix are needed. Experimental results are presented on real images. This work provides a theoretical foundation for image-based representations of 3D scenes by demonstrating that perspective view synthesis is a theoretically well-posed problem.
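
The core observation behind view morphing is that, for parallel basis views, linearly interpolating the image positions (and colors) of corresponding pixels produces a physically valid perspective view whose optical center lies on the line between the two camera centers. The sketch below (Python/numpy) illustrates only this interpolation step; it is not the authors' implementation. It assumes the two basis images have already been prewarped (rectified) to parallel views and that a dense horizontal disparity map is available. The function name and the disparity input are hypothetical, and the prewarp/postwarp stages and the visibility handling of the paper's scanline algorithm are omitted for brevity.

import numpy as np

def morph_parallel_views(img0, img1, disparity, s):
    # Hypothetical helper: synthesize the in-between view at interpolation
    # factor s in [0, 1] from two parallel (rectified) basis images.
    # disparity[y, x] is assumed to hold the horizontal offset taking
    # pixel (y, x) of img0 to its correspondent in img1.
    h, w = disparity.shape
    out = np.zeros((h, w) + img0.shape[2:], dtype=float)
    for y in range(h):
        for x in range(w):
            x1 = x + disparity[y, x]                # correspondent column in img1
            xs = int(round((1 - s) * x + s * x1))   # linearly interpolated column
            if 0 <= xs < w:
                c1 = img1[y, min(max(int(round(x1)), 0), w - 1)]
                # Cross-dissolve the two source colors at the interpolated position;
                # overlaps and holes are ignored in this simplified sketch.
                out[y, xs] = (1 - s) * img0[y, x] + s * c1
    return out

Sweeping s from 0 to 1 traces out the continuous range of in-between views referred to in the abstract; handling folds and holes correctly is where the monotonicity constraint and the paper's scanline algorithm come in.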

BibTeX

@conference{Seitz-1996-16319,
author = {Steven Seitz and C. R. Dyer},
title = {Toward Image-Based Scene Representation Using View Morphing},
booktitle = {Proceedings of the 13th International Conference on Pattern Recognition (ICPR '96)},
year = {1996},
month = {August},
pages = {84--89},
}