
Self-paced Mixture of Regressions

Longfei Han, Dingwen Zhang, Dong Huang, Xiaojun Chang, Jun Ren, Senlin Luo, and Junwei Han
Conference Paper, Proceedings of the 26th International Joint Conference on Artificial Intelligence (IJCAI '17), pp. 1816–1822, August 2017

Abstract

Mixture of regressions (MoR) is a well-established and effective approach for modeling discontinuous and heterogeneous data in regression problems. Existing MoR approaches assume a smooth joint distribution for its good analytic properties. However, this assumption makes existing MoR very sensitive to intra-component outliers (noisy training data residing in certain components) and to inter-component imbalance (different amounts of training data in different components). In this paper, we make the earliest effort on Self-paced Learning (SPL) in MoR, i.e., the Self-paced Mixture of Regressions (SPMoR) model. We propose a novel self-paced regularizer based on the Exclusive LASSO, which improves the inter-component balance of the training data. As a robust learning regime, SPL pursues confidence-based sample reasoning. To demonstrate the effectiveness of SPMoR, we conducted experiments on both synthetic examples and real-world applications to age estimation and glucose estimation. The results show that SPMoR outperforms state-of-the-art methods.
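To make the general idea concrete, the sketch below fits a mixture of two linear regressions with an EM-style loop and a simple self-paced step that only trains on samples whose current loss is below a growing threshold. This is a minimal illustration of self-paced learning applied to MoR, not the authors' SPMoR algorithm: the Exclusive-LASSO-based self-paced regularizer from the paper is not reproduced, hard assignments are used in place of soft responsibilities, and all data, parameter names, and the annealing schedule are illustrative assumptions.

```python
# Minimal sketch (NOT the paper's implementation): two-component mixture of
# linear regressions fit by alternating assignment/refitting, with a hard
# self-paced sample-selection step based on a per-sample loss threshold.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic heterogeneous data: two linear components plus a few outliers,
# mimicking the intra-component-outlier setting discussed in the abstract.
n = 200
x = rng.uniform(-3, 3, size=n)
comp = rng.integers(0, 2, size=n)
y = np.where(comp == 0, 2.0 * x + 1.0, -1.5 * x + 4.0) + rng.normal(0, 0.3, n)
y[:10] += rng.normal(0, 8.0, 10)               # inject intra-component outliers

X = np.column_stack([x, np.ones(n)])           # design matrix with intercept
K = 2                                          # number of regression components
W = rng.normal(size=(K, 2))                    # per-component [slope, intercept]
lam = 1.0                                      # self-paced loss threshold ("pace")

for it in range(30):
    # Assignment step: squared residual of every sample under every component.
    resid = y[None, :] - W @ X.T               # shape (K, n)
    loss = resid ** 2
    assign = loss.argmin(axis=0)               # hard assignment to best component
    per_sample_loss = loss[assign, np.arange(n)]

    # Self-paced step: keep only "easy" (confident) samples for this round.
    v = per_sample_loss < lam

    # Refit step: ordinary least squares per component on the selected samples.
    for k in range(K):
        sel = v & (assign == k)
        if sel.sum() >= 2:
            W[k], *_ = np.linalg.lstsq(X[sel], y[sel], rcond=None)

    lam *= 1.3                                 # anneal: gradually admit harder samples

print("estimated components (slope, intercept):")
print(W)
```

Because the outliers incur large losses, they are excluded in early rounds and only admitted once the component fits have stabilized, which is the robustness intuition behind self-paced learning; the paper's contribution additionally balances how many samples each component admits, which this sketch does not attempt.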

BibTeX

@conference{Han-2017-126373,
author = {Longfei Han and Dingwen Zhang and Dong Huang and Xiaojun Chang and Jun Ren and Senlin Luo and Junwei Han},
title = {Self-paced Mixture of Regressions},
booktitle = {Proceedings of the 26th International Joint Conference on Artificial Intelligence (IJCAI '17)},
year = {2017},
month = {August},
pages = {1816--1822},
}