
Relaxed Exponential Kernels for Unsupervised Learning

K. T. Abou-Moustafa, M. Shah, F. De la Torre, and F. P. Ferrie
Conference Paper, Proceedings of the 33rd Annual Symposium of the German Association for Pattern Recognition (DAGM '11), pp. 184–195, August 2011

Abstract

Many unsupervised learning algorithms make use of kernels that rely on the Euclidean distance between two samples. However, the Euclidean distance is optimal only for Gaussian distributed data. In this paper, we relax the global Gaussian assumption made by the Euclidean distance, and propose a local Gaussian modelling of the immediate neighbourhood of each sample, resulting in an augmented data space formed by the parameters of the local Gaussians. To this end, we propose a convolution kernel for the augmented data space. The factorisable nature of this kernel allows us to introduce (semi-)metrics for this space, which in turn yield relaxed versions of known kernels. We present empirical results that validate the utility of the proposed localized approach in the context of spectral clustering. The key result of this paper is that combining the local Gaussian model with measures that adhere to metric properties yields much better performance across different spectral clustering tasks.
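
For intuition, the following sketch (not the authors' code) illustrates the general recipe described in the abstract: fit a local Gaussian to each sample's k-nearest neighbourhood, compare the local Gaussians with a (semi-)metric, and pass the resulting exponential affinity to spectral clustering. The neighbourhood size k, the Bhattacharyya-style distance, and the bandwidth sigma below are illustrative assumptions, not the paper's exact construction.

import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.cluster import SpectralClustering

def local_gaussians(X, k=10, reg=1e-6):
    """Fit a Gaussian (mean, covariance) to the k-nearest neighbourhood of each sample."""
    nn = NearestNeighbors(n_neighbors=k).fit(X)
    _, idx = nn.kneighbors(X)
    means, covs = [], []
    for i in range(X.shape[0]):
        nbrs = X[idx[i]]
        means.append(nbrs.mean(axis=0))
        # Small ridge term keeps the local covariance invertible.
        covs.append(np.cov(nbrs, rowvar=False) + reg * np.eye(X.shape[1]))
    return np.array(means), np.array(covs)

def bhattacharyya(m1, S1, m2, S2):
    """Bhattacharyya distance between two Gaussians (one possible (semi-)metric)."""
    S = 0.5 * (S1 + S2)
    diff = m1 - m2
    term1 = 0.125 * diff @ np.linalg.solve(S, diff)
    term2 = 0.5 * np.log(np.linalg.det(S) /
                         np.sqrt(np.linalg.det(S1) * np.linalg.det(S2)))
    return term1 + term2

def relaxed_affinity(X, k=10, sigma=1.0):
    """Exponential affinity built from distances between the local Gaussians."""
    means, covs = local_gaussians(X, k)
    n = X.shape[0]
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            d = bhattacharyya(means[i], covs[i], means[j], covs[j])
            A[i, j] = A[j, i] = np.exp(-d / sigma)
    np.fill_diagonal(A, 1.0)
    return A

# Usage: spectral clustering with the precomputed affinity.
# X = np.random.randn(200, 5)
# labels = SpectralClustering(n_clusters=3, affinity='precomputed').fit_predict(relaxed_affinity(X))

Here the Euclidean affinity exp(-||x_i - x_j||^2 / sigma) of standard spectral clustering is simply replaced by an exponential of a distance between the two local Gaussian models, which is the relaxation the paper's title refers to.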

BibTeX

@conference{Abou-Moustafa-2011-120915,
author = {K. T. Abou-Moustafa and M. Shah and F. De la Torre and F. P. Ferrie},
title = {Relaxed Exponential Kernels for Unsupervised Learning},
booktitle = {Proceedings of 33rd Annual Symposium of the German Association for Pattern Recognition (DAGM '11)},
year = {2011},
month = {August},
pages = {184--195},
}