On the Error of Random Fourier Features - Robotics Institute Carnegie Mellon University

On the Error of Random Fourier Features

D. Sutherland and J. Schneider
Conference Paper, Proceedings of the 31st Conference on Uncertainty in Artificial Intelligence (UAI '15), pp. 862–871, July 2015

Abstract

Kernel methods give powerful, flexible, and theoretically grounded approaches to solving many problems in machine learning. The standard approach, however, requires pairwise evaluations of a kernel function, which can lead to scalability issues for very large datasets. Rahimi and Recht (2007) suggested a popular approach to handling this problem, known as random Fourier features. The quality of this approximation, however, is not well understood. We improve the uniform error bound of that paper, and also give a novel understanding of the embedding's variance, approximation error, and use in some machine learning methods. We also point out that, surprisingly, of the two main variants of those features, the more widely used one is strictly higher-variance for the Gaussian kernel and has worse bounds.
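For context, the two variants mentioned in the abstract are the feature maps z(x) = √(2/D) cos(Wx + b) (the more widely used one, with random phases b) and z(x) = √(2/D) [cos(Wx), sin(Wx)] (using D/2 frequencies). The sketch below is not the authors' code; it is a minimal numpy illustration of both maps approximating a Gaussian kernel, with all variable names chosen here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
d, D, sigma = 5, 2000, 1.0  # input dim, feature dim, kernel bandwidth

x = rng.normal(size=d)
y = rng.normal(size=d)

# Gaussian kernel k(x, y) = exp(-||x - y||^2 / (2 sigma^2));
# its Fourier transform is a Gaussian, so frequencies are drawn N(0, I / sigma^2).
true_k = np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))

# Variant 1 (widely used): random frequencies plus random phases.
W = rng.normal(scale=1.0 / sigma, size=(D, d))
b = rng.uniform(0.0, 2.0 * np.pi, size=D)
z_phase = lambda v: np.sqrt(2.0 / D) * np.cos(W @ v + b)
k_phase = z_phase(x) @ z_phase(y)

# Variant 2: paired cos/sin features from D/2 frequencies (same total dimension D).
W2 = rng.normal(scale=1.0 / sigma, size=(D // 2, d))
z_pair = lambda v: np.sqrt(2.0 / D) * np.concatenate([np.cos(W2 @ v), np.sin(W2 @ v)])
k_pair = z_pair(x) @ z_pair(y)
```

Both inner products estimate `true_k`; note that the cos/sin variant gives ||z(x)||² = 1 exactly, since cos² + sin² sums to D/2 before rescaling.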

BibTeX

@conference{Sutherland-2015-119779,
author = {D. Sutherland and J. Schneider},
title = {On the Error of Random Fourier Features},
booktitle = {Proceedings of the 31st Conference on Uncertainty in Artificial Intelligence (UAI '15)},
year = {2015},
month = {July},
pages = {862--871},
}