Lass0: Sparse Non-Convex Regression by Local Search - Robotics Institute Carnegie Mellon University

Lass0: Sparse Non-Convex Regression by Local Search

William Herlands, Maria De Arteaga, Daniel Neill, and Artur Dubrawski
Workshop Paper, NeurIPS '15 Workshop on Optimization for Machine Learning, December 2015

Abstract

We compute approximate solutions to L0 regularized linear regression problems using the convex relaxation given by L1 regularization, also known as the Lasso, as an initialization step. Our algorithm, the Lass-0 ("Lass-zero"), uses a computationally efficient stepwise search to determine a locally optimal L0 solution given any L1 regularization solution. We present theoretical results showing consistency under orthogonality and appropriate handling of redundant features. Empirically, we demonstrate on synthetic data that the Lass-0 solutions are closer to the true sparse support than L1 regularization models. Additionally, on real-world data Lass-0 finds more parsimonious solutions than L1 regularization while maintaining similar predictive accuracy.
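The abstract describes a two-stage procedure: solve the L1-penalized (Lasso) problem, then run a local stepwise search on the L0-penalized least-squares objective starting from the Lasso support. The sketch below illustrates that idea under stated assumptions; the function names (`lasso_cd`, `lass0`), the single-feature flip moves, and the objective form 0.5·RSS + λ·‖β‖₀ are illustrative choices, not the authors' exact algorithm.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Plain coordinate-descent Lasso, used only as the L1
    initialization step (any L1 solver would do here)."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual with feature j removed.
            r = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r
            # Soft-thresholding update for coordinate j.
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return beta

def lass0(X, y, lam):
    """Hypothetical sketch of the Lass-0 idea: start from the Lasso
    support, then greedily add or drop one feature at a time on the
    L0-penalized objective until no move improves it (local optimum)."""
    n, p = X.shape

    def refit(support):
        # Unpenalized least-squares refit restricted to the support.
        beta = np.zeros(p)
        idx = sorted(support)
        if idx:
            beta[idx] = np.linalg.lstsq(X[:, idx], y, rcond=None)[0]
        return beta

    def objective(support):
        resid = y - X @ refit(support)
        return 0.5 * resid @ resid + lam * len(support)

    support = set(np.flatnonzero(np.abs(lasso_cd(X, y, lam)) > 1e-8))
    best = objective(support)
    improved = True
    while improved:
        improved = False
        for j in range(p):
            cand = support ^ {j}  # flip feature j in or out of the model
            val = objective(cand)
            if val < best - 1e-12:
                support, best, improved = cand, val, True
    return refit(support)
```

On well-conditioned synthetic data with a strong sparse signal, the local search typically prunes any spurious features the Lasso admits, since dropping one saves λ at a negligible cost in residual error.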

BibTeX

@inproceedings{De-2015-121842,
author = {William Herlands and Maria De Arteaga and Daniel Neill and Artur Dubrawski},
title = {Lass0: Sparse Non-Convex Regression by Local Search},
booktitle = {Proceedings of NeurIPS '15 Workshop on Optimization for Machine Learning},
year = {2015},
month = {December},
}