Kernel Conjugate Gradient
Tech. Report, CMU-RI-TR-05-30, Robotics Institute, Carnegie Mellon University, June, 2005
Abstract
We propose a novel variant of conjugate gradient based on the Reproducing Kernel Hilbert Space (RKHS) inner product. An analysis of the algorithm suggests it enjoys better performance properties than standard iterative methods when applied to learning kernel machines. Experimental results for both classification and regression bear out the theoretical implications. We further address the dominant cost of the algorithm by reducing the complexity of RKHS function evaluations and inner products through the use of space-partitioning tree data structures.
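To make the idea concrete, here is a minimal illustrative sketch, not the authors' algorithm as specified in the report: conjugate gradient applied to the regularized-least-squares system (K + λI)α = y, with inner products between coefficient vectors u, v taken in the RKHS metric ⟨u, v⟩ = uᵀKv rather than the Euclidean dot product. This is well-defined because K(K + λI) is symmetric positive definite when K is. The `rbf_kernel` helper, the choice of λ, and all variable names are assumptions for illustration.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def kernel_cg(K, y, lam=1e-2, tol=1e-8, max_iter=1000):
    """Solve (K + lam*I) alpha = y by conjugate gradient, measuring
    inner products between coefficient vectors in the RKHS metric
    <u, v> = u^T K v instead of the Euclidean dot product.

    A sketch only: the report's kernel CG is derived from the
    functional gradient; this shows the metric substitution alone."""
    n = len(y)
    A = K + lam * np.eye(n)
    alpha = np.zeros(n)
    r = y.astype(float).copy()   # residual of the linear system
    p = r.copy()                 # search direction
    rs = r @ K @ r               # squared RKHS norm of the residual
    for _ in range(max_iter):
        Ap = A @ p
        step = rs / (p @ K @ Ap)
        alpha += step * p
        r -= step * Ap
        if np.linalg.norm(r) < tol:
            break
        rs_new = r @ K @ r
        p = r + (rs_new / rs) * p
        rs = rs_new
    return alpha
```

Replacing the Euclidean inner product with the K-weighted one is the structural change the abstract describes; because A = K + λI commutes with K here, the iterates stay in the usual Krylov space span{y, Ay, A²y, ...}, only the conjugacy condition changes.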
BibTeX
@techreport{Ratliff-2005-9194,
  author = {Nathan Ratliff and J. Andrew (Drew) Bagnell},
  title = {Kernel Conjugate Gradient},
  year = {2005},
  month = {June},
  institution = {Carnegie Mellon University},
  address = {Pittsburgh, PA},
  number = {CMU-RI-TR-05-30},
  keywords = {kernel methods, functional gradient, reproducing kernel Hilbert spaces, conjugate gradient, kernel logistic regression, regularized least squares, Gaussian processes, kd-trees},
}
Copyright notice: This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. These works may not be reposted without the explicit permission of the copyright holder.