
Designing a Metric for the Difference between Gaussian Densities

K. T. Abou-Moustafa, F. De la Torre, and F. P. Ferrie
Conference Paper, Proceedings of International Symposium on the Occasion of the 25th Anniversary of the McGill University Centre for Intelligent Machines, pp. 57-70, December 2010

Abstract

Measuring the difference between two multivariate Gaussians is central to statistics and machine learning. Traditional measures based on the Bhattacharyya coefficient or the symmetric Kullback-Leibler divergence do not satisfy the metric properties required by many algorithms. This paper proposes a metric for Gaussian densities. Like the Bhattacharyya distance and the symmetric Kullback-Leibler divergence, the proposed metric reduces the difference between two Gaussians to a difference between their parameters. Based on the proposed metric, we introduce a symmetric and positive semi-definite kernel between Gaussian densities. We illustrate the benefits of the proposed metric in two settings: (1) a supervised problem, in which we learn a low-dimensional projection that maximizes the distance between Gaussians, and (2) an unsupervised spectral clustering problem, in which the similarity between samples is measured with the proposed kernel.
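
For reference, both baselines named in the abstract have well-known closed forms that depend only on the parameters (\mu_1, \Sigma_1) and (\mu_2, \Sigma_2) of the two Gaussians. Each is symmetric, but neither obeys the triangle inequality, which is the metric property at issue. In standard notation, with \bar{\Sigma} = \frac{1}{2}(\Sigma_1 + \Sigma_2), the Bhattacharyya distance and the symmetric Kullback-Leibler (Jeffreys) divergence are:

    d_B(p_1, p_2) = \frac{1}{8} (\mu_1 - \mu_2)^\top \bar{\Sigma}^{-1} (\mu_1 - \mu_2)
                    + \frac{1}{2} \ln \frac{\det \bar{\Sigma}}{\sqrt{\det \Sigma_1 \det \Sigma_2}}

    d_{SKL}(p_1, p_2) = \frac{1}{2} \operatorname{tr}\left( \Sigma_2^{-1} \Sigma_1 + \Sigma_1^{-1} \Sigma_2 - 2 I_d \right)
                        + \frac{1}{2} (\mu_1 - \mu_2)^\top \left( \Sigma_1^{-1} + \Sigma_2^{-1} \right) (\mu_1 - \mu_2)

Both reduce the difference between densities to a comparison of parameters, which is the property the abstract says the proposed metric shares.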

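To make the unsupervised setting concrete, here is a minimal Python sketch of spectral clustering over a set of Gaussians. The abstract does not give the proposed metric or kernel in closed form, so the sketch substitutes the symmetric Kullback-Leibler divergence above and an RBF-style affinity; unlike the kernel proposed in the paper, this affinity is not guaranteed to be positive semi-definite. The data and the bandwidth choice are hypothetical.

import numpy as np
from sklearn.cluster import SpectralClustering

def sym_kl(mu1, S1, mu2, S2):
    # Symmetric KL (Jeffreys) divergence between N(mu1, S1) and N(mu2, S2);
    # a stand-in for the paper's metric, which is not given in this abstract.
    d = mu1.shape[0]
    S1_inv, S2_inv = np.linalg.inv(S1), np.linalg.inv(S2)
    dm = mu1 - mu2
    trace_term = np.trace(S2_inv @ S1 + S1_inv @ S2) - 2 * d
    quad_term = dm @ (S1_inv + S2_inv) @ dm
    return 0.5 * (trace_term + quad_term)

# Hypothetical input: one Gaussian (mean, SPD covariance) per object to
# cluster, drawn from two loosely separated groups.
rng = np.random.default_rng(0)
gaussians = []
for c in range(2):
    for _ in range(10):
        mu = rng.normal(loc=4.0 * c, scale=0.5, size=3)
        A = rng.normal(size=(3, 3))
        gaussians.append((mu, A @ A.T + 0.1 * np.eye(3)))

# Pairwise divergence matrix (symmetric, zero diagonal).
n = len(gaussians)
D = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        D[i, j] = D[j, i] = sym_kl(*gaussians[i], *gaussians[j])

# RBF-style affinity; the median heuristic for the bandwidth is an assumption.
sigma = np.median(D[D > 0])
K = np.exp(-D / sigma)

labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                            random_state=0).fit_predict(K)
print(labels)
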
BibTeX

@conference{Abou-Moustafa-2010-120925,
author = {K. T. Abou-Moustafa and F. De la Torre and F. P. Ferrie},
title = {Designing a Metric for the Difference between Gaussian Densities},
booktitle = {Proceedings of International Symposium on the Occasion of the 25th Anniversary of the McGill University Centre for Intelligent Machines},
year = {2010},
month = {December},
pages = {57--70},
}