
Nonparametric risk and stability analysis for multi-task learning problems

X. Wang, J. Oliva, J. Schneider, and B. Poczos
Conference Paper, Proceedings of the 25th International Joint Conference on Artificial Intelligence (IJCAI '16), pp. 2146-2152, July 2016

Abstract

Multi-task learning attempts to simultaneously leverage data from multiple domains in order to estimate related functions on each domain. For example, a special case of multi-task learning, transfer learning, is often employed when one has a good estimate of a function on a source domain, but is unable to estimate a related function well on a target domain using only target data. Multi-task/transfer learning problems are usually solved by imposing some kind of "smooth" relationship among/between tasks. In this paper, we study how different smoothness assumptions on task relations affect the upper bounds of algorithms proposed for these problems under different settings. For general multi-task learning, we study a family of algorithms which utilize a reweighting matrix on task weights to capture the smooth relationship among tasks, and which has many instantiations in the existing literature. Furthermore, for multi-task learning in a transfer learning framework, we study the recently proposed algorithms for "model shift", where the conditional distribution P(Y|X) is allowed to change across tasks but the change is assumed to be smooth. In addition, we illustrate our results with experiments on both simulated and real data.
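
The "reweighting matrix on task weights" formulation mentioned in the abstract can be illustrated with a small linear example. The sketch below (Python/NumPy) fits one weight vector per task by minimizing a squared loss plus a coupling penalty lam * tr(W Omega W^T), where Omega is a T x T task-relationship matrix; the function name multitask_ridge, the squared-loss objective, and this particular penalty are assumptions made for illustration only, not the paper's exact estimator or analysis.

import numpy as np

def multitask_ridge(Xs, ys, Omega, lam=1.0, gamma=1.0):
    # Jointly fit linear task weights w_1, ..., w_T by minimizing
    #   sum_t ||X_t w_t - y_t||^2 + lam * tr(W Omega W^T) + gamma * ||W||_F^2,
    # where W collects the w_t as columns and Omega is a symmetric T x T
    # reweighting (task-relationship) matrix. Illustrative sketch only.
    T, d = len(Xs), Xs[0].shape[1]
    A = np.zeros((T * d, T * d))
    b = np.zeros(T * d)
    for t in range(T):
        # Per-task least-squares normal equations plus an isotropic ridge term.
        A[t*d:(t+1)*d, t*d:(t+1)*d] += Xs[t].T @ Xs[t] + gamma * np.eye(d)
        b[t*d:(t+1)*d] = Xs[t].T @ ys[t]
        # Coupling term: lam * sum_s Omega[t, s] * w_s enters task t's equation.
        for s in range(T):
            A[t*d:(t+1)*d, s*d:(s+1)*d] += lam * Omega[t, s] * np.eye(d)
    w = np.linalg.solve(A, b)
    return w.reshape(T, d)   # row t holds the fitted weights for task t

# Example usage: two related regression tasks coupled so that the penalty
# lam * tr(W Omega W^T) equals lam * ||w_1 - w_2||^2, a smoothness prior
# tying the two tasks together.
rng = np.random.default_rng(0)
X1, X2 = rng.normal(size=(50, 3)), rng.normal(size=(40, 3))
w_star = np.array([1.0, -2.0, 0.5])
y1 = X1 @ w_star + 0.1 * rng.normal(size=50)
y2 = X2 @ (w_star + 0.2) + 0.1 * rng.normal(size=40)
Omega = np.array([[1.0, -1.0], [-1.0, 1.0]])
W = multitask_ridge([X1, X2], [y1, y2], Omega, lam=5.0, gamma=0.1)

Different choices of Omega (for example, a graph Laplacian over tasks) recover different instantiations of this family; the smoothness assumptions encoded in Omega are exactly the kind of task-relation assumptions whose effect on risk bounds the paper studies.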

BibTeX

@conference{Wang-2016-119755,
author = {X. Wang and J. Oliva and J. Schneider and B. Poczos},
title = {Nonparametric risk and stability analysis for multi-task learning problems},
booktitle = {Proceedings of the 25th International Joint Conference on Artificial Intelligence (IJCAI '16)},
year = {2016},
month = {July},
pages = {2146--2152},
}