Lass0: Sparse Non-Convex Regression by Local Search
Workshop Paper, NeurIPS '15 Workshop on Optimization for Machine Learning, December 2015
Abstract
We compute approximate solutions to L0-regularized linear regression problems using the convex relaxation, L1 regularization (also known as the Lasso), as an initialization step. Our algorithm, the Lass-0 ("Lass-zero"), uses a computationally efficient stepwise search to determine a locally optimal L0 solution given any L1 regularization solution. We present theoretical results on consistency under orthogonality and on the appropriate handling of redundant features. Empirically, we demonstrate on synthetic data that Lass-0 solutions are closer to the true sparse support than those of L1 regularization models. Additionally, on real-world data, Lass-0 finds more parsimonious solutions than L1 regularization while maintaining similar predictive accuracy.
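The abstract outlines a two-stage procedure: an L1 (Lasso) fit to obtain an initial sparse support, followed by a stepwise local search on the L0-penalized objective. Below is a minimal sketch of that idea, assuming an L0-penalized least-squares objective with OLS refitting on each candidate support and a single-feature add/remove move set; the move set, objective form, and parameter names (`lam1`, `lam0`) are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np
from sklearn.linear_model import Lasso


def l0_objective(X, y, support, lam0):
    """Assumed objective: refit OLS on `support`, then add lam0 * |support|."""
    if not support:
        return np.sum(y ** 2)
    Xs = X[:, sorted(support)]
    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    resid = y - Xs @ beta
    return np.sum(resid ** 2) + lam0 * len(support)


def lass0_sketch(X, y, lam1=0.1, lam0=1.0):
    """Lasso initialization followed by stepwise local search (sketch)."""
    # Step 1: the L1 relaxation provides an initial sparse support.
    init = Lasso(alpha=lam1).fit(X, y)
    support = set(np.flatnonzero(init.coef_))
    best = l0_objective(X, y, support, lam0)
    # Step 2: greedily toggle single features in/out of the support while
    # the L0-penalized objective improves; terminates at a local optimum
    # with respect to this move set.
    improved = True
    while improved:
        improved = False
        for j in range(X.shape[1]):
            cand = support ^ {j}  # toggle feature j
            val = l0_objective(X, y, cand, lam0)
            if val < best - 1e-10:
                support, best = cand, val
                improved = True
    return sorted(support), best


# Example usage on synthetic data with a 3-feature true support:
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
y = X[:, :3] @ np.array([2.0, -1.5, 1.0]) + 0.1 * rng.standard_normal(100)
support, obj = lass0_sketch(X, y)
```

Refitting by OLS on each candidate support (rather than reusing the shrunken Lasso coefficients) is one common way to evaluate an L0-penalized objective fairly; whether the paper does exactly this is not stated in the abstract.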
BibTeX
@inproceedings{De-2015-121842,
  author    = {William Herlands and Maria De Arteaga and Daniel Neill and Artur Dubrawski},
  title     = {Lass0: Sparse Non-Convex Regression by Local Search},
  booktitle = {Proceedings of NeurIPS '15 Workshop on Optimization for Machine Learning},
  year      = {2015},
  month     = {December},
}
Copyright notice: This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. These works may not be reposted without the explicit permission of the copyright holder.