Flexible Transfer Learning under Support and Model Shift

X. Wang and J. Schneider
Conference Paper, Proceedings of (NeurIPS) Neural Information Processing Systems, Vol. 2, pp. 1898–1906, December 2014

Abstract

Transfer learning algorithms are used when one has sufficient training data for one supervised learning task (the source/training domain) but only very limited training data for a second task (the target/test domain) that is similar but not identical to the first. Previous work on transfer learning has focused on relatively restricted settings, where specific parts of the model are assumed to carry over between tasks. Recent work on covariate shift focuses on matching the marginal distributions of the observations X across domains. Similarly, work on target/conditional shift focuses on matching the marginal distributions of the labels Y and adjusting the conditional distributions P(X|Y) so that P(X) can be matched across domains. However, covariate shift assumes that the support of the test P(X) is contained in the support of the training P(X), i.e., that the training set is richer than the test set; target/conditional shift makes an analogous assumption for P(Y). Moreover, little work on transfer learning has considered the case where a few labels are available in the test domain, and little has addressed the setting where all marginal and conditional distributions are allowed to change, as long as the changes are smooth. In this paper, we consider a general case where both the support and the model change across domains. We transform both X and Y by a location-scale shift to achieve transfer between tasks. Because we allow more flexible transformations, the proposed method yields better results on both synthetic and real-world data.
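
As a rough illustration of the location-scale idea, the Python sketch below matches the per-feature mean and standard deviation of the source X and Y to the few labeled target points, then fits a single regressor on the pooled data. This is a hypothetical toy, not the authors' algorithm: in the paper the transformations are learned and allowed to be more flexible, rather than fixed from moments as here.

# Toy location-scale transfer: shift/rescale source X and Y so their
# first two moments match the (scarce) labeled target data, then train
# one regressor on the pooled set. Moment matching is a crude stand-in
# for the learned transformations in Wang & Schneider (2014).

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def location_scale(a, ref):
    """Affine map making a's per-column mean/std match ref's
    (assumption: moments stand in for the learned transform)."""
    scale = ref.std(axis=0) / (a.std(axis=0) + 1e-12)
    return (a - a.mean(axis=0)) * scale + ref.mean(axis=0)

rng = np.random.default_rng(0)

# Source domain: plentiful labels.
X_src = rng.uniform(-2.0, 2.0, size=(200, 1))
y_src = np.sin(X_src[:, 0]) + 0.05 * rng.standard_normal(200)

# Target domain: shifted support and a location-scale change in Y,
# with only a handful of labels available.
X_tgt = rng.uniform(0.0, 6.0, size=(8, 1))
y_tgt = 2.0 * np.sin(X_tgt[:, 0] - 1.0) + 0.5

# Transform source so its X and Y moments match the target labels.
X_src_t = location_scale(X_src, X_tgt)
y_src_t = location_scale(y_src[:, None], y_tgt[:, None]).ravel()

# Pool transformed source with the few target labels and fit a GP.
model = GaussianProcessRegressor().fit(
    np.vstack([X_src_t, X_tgt]), np.concatenate([y_src_t, y_tgt])
)
print(model.predict(np.array([[3.0]])))

Because the transformation here is a single global affine map estimated from moments, it captures only the simplest instance of support and model shift; the paper's contribution is precisely that the transformations are learned and constrained to change smoothly.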

BibTeX

@conference{Wang-2014-119781,
author = {X. Wang and J. Schneider},
title = {Flexible Transfer Learning under Support and Model Shift},
booktitle = {Proceedings of (NeurIPS) Neural Information Processing Systems},
year = {2014},
month = {December},
volume = {2},
pages = {1898--1906},
}