Learning from Small Sample Sets by Combining Unsupervised Meta-Training with CNNs

Conference Paper, Proceedings of (NeurIPS) Neural Information Processing Systems, pp. 244–252, December 2016

Abstract

This work explores CNNs for the recognition of novel categories from few examples. Inspired by the transferability properties of CNNs, we introduce an additional unsupervised meta-training stage that exposes multiple top-layer units to large amounts of unlabeled real-world images. By encouraging these units to learn diverse sets of low-density separators across the unlabeled data, we capture a more generic, richer description of the visual world, which decouples these units from ties to a specific set of categories. We propose an unsupervised margin-maximization scheme that jointly estimates compact high-density regions and infers low-density separators. The low-density separator (LDS) modules can be plugged into any or all of the top layers of a standard CNN architecture. The resulting CNNs significantly improve performance on scene classification, fine-grained recognition, and action recognition with small training samples.
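To make the two-step idea in the abstract concrete, the sketch below alternates between (1) estimating compact high-density regions over unlabeled CNN features and (2) fitting max-margin separators between pairs of those regions, so that decision boundaries fall in low-density areas. This is a minimal illustration only, not the authors' implementation: it substitutes k-means clusters for the paper's high-density region estimation and linear SVMs for its margin maximization, and all function and parameter names (learn_lds_units, lds_transform, n_units) are hypothetical.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import LinearSVC

def learn_lds_units(features, n_units=16, seed=0):
    """Learn a bank of low-density separators over unlabeled features.

    Each 'unit' is a linear separator fit between a pair of compact
    high-density regions (approximated here by k-means clusters), so its
    decision boundary tends to lie in low-density areas of feature space.
    """
    rng = np.random.default_rng(seed)
    # Step 1: estimate compact high-density regions (clusters as a proxy).
    km = KMeans(n_clusters=2 * n_units, n_init=10, random_state=seed).fit(features)
    labels = km.labels_
    units = []
    # Step 2: fit a max-margin separator between random cluster pairs.
    for a, b in rng.permutation(2 * n_units).reshape(-1, 2):
        mask = np.isin(labels, [a, b])
        X, y = features[mask], (labels[mask] == a).astype(int)
        if len(np.unique(y)) < 2:   # skip degenerate pairs
            continue
        svm = LinearSVC(C=1.0).fit(X, y)
        units.append((svm.coef_.ravel(), svm.intercept_[0]))
    return units

def lds_transform(features, units):
    """Apply the learned separators as extra top-layer unit activations."""
    W = np.stack([w for w, _ in units])
    b = np.array([bias for _, bias in units])
    return np.maximum(features @ W.T + b, 0.0)  # ReLU-style response

if __name__ == "__main__":
    X = np.random.randn(500, 64)          # stand-in for CNN top-layer features
    units = learn_lds_units(X, n_units=8)
    Z = lds_transform(X, units)           # generic LDS representation
    print(Z.shape)

Because the separators are learned from unlabeled data rather than category labels, the resulting representation is not tied to any particular label set; in the paper, such LDS modules are attached to the top layers of a pre-trained CNN before fine-tuning on the small labeled sample.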

BibTeX

@conference{Wang-2016-26050,
author = {Yuxiong Wang and Martial Hebert},
title = {Learning from Small Sample Sets by Combining Unsupervised Meta-Training with CNNs},
booktitle = {Proceedings of (NeurIPS) Neural Information Processing Systems},
year = {2016},
month = {December},
pages = {244--252},
}