Image Matching in Large Scale Indoor Environment
Abstract
In this paper, we present a data-driven approach to first-person vision and propose a novel image matching algorithm, named Re-Search, that is designed to cope with self-repetitive structures and confusing patterns in indoor environments. This algorithm uses state-of-the-art image search techniques and matches a query image with a two-pass strategy. In the first pass, a conventional image search algorithm is used to retrieve a small number of images that are most similar to the query image. In the second pass, the retrieval results from the first pass are used to discover features that are more distinctive in the local context. We demonstrate and evaluate the Re-Search algorithm in the context of indoor localization, and illustrate potential applications in object pop-out and data-driven zoom-in.
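The two-pass idea can be sketched with a simple bag-of-visual-words retrieval pipeline: rank the database with global TF-IDF, then recompute IDF weights over the first-pass shortlist so that visual words common to repetitive indoor structures are down-weighted. The sketch below is only illustrative; the shortlist size `k`, the cosine scoring, and the exact local re-weighting are assumptions, not the authors' precise formulation.

```python
import numpy as np

def tfidf_scores(query_hist, db_hists, idf):
    """Cosine similarity between TF-IDF weighted visual-word histograms."""
    q = query_hist * idf
    d = db_hists * idf
    q = q / (np.linalg.norm(q) + 1e-12)
    d = d / (np.linalg.norm(d, axis=1, keepdims=True) + 1e-12)
    return d @ q

def re_search(query_hist, db_hists, k=20):
    """Two-pass retrieval: global TF-IDF ranking, then re-ranking with
    IDF recomputed on the first-pass shortlist (the local context)."""
    n_db, n_words = db_hists.shape

    # Pass 1: conventional TF-IDF retrieval over the whole database.
    df_global = (db_hists > 0).sum(axis=0)
    idf_global = np.log((n_db + 1) / (df_global + 1))
    scores1 = tfidf_scores(query_hist, db_hists, idf_global)
    shortlist = np.argsort(-scores1)[:k]

    # Pass 2: recompute IDF on the shortlist only, so visual words shared
    # by many near-identical retrieved images (self-repetitive structures)
    # are suppressed and locally distinctive features dominate the ranking.
    df_local = (db_hists[shortlist] > 0).sum(axis=0)
    idf_local = np.log((k + 1) / (df_local + 1))
    scores2 = tfidf_scores(query_hist, db_hists[shortlist], idf_local)
    return shortlist[np.argsort(-scores2)]
```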
BibTeX
@inproceedings{Kang-2009-10232,
  author    = {Hongwen Kang and Alexei A. Efros and Martial Hebert and Takeo Kanade},
  title     = {Image Matching in Large Scale Indoor Environment},
  booktitle = {Proceedings of CVPR '09 Workshop on Egocentric Vision},
  year      = {2009},
  month     = {June},
  pages     = {33--40},
  keywords  = {Image matching, large scale, indoor localization, Re-Search, local TF-IDF},
}