Copula-based Kernel Dependency Measures
Abstract
The paper presents a new copula-based method for measuring dependence between random variables. Our approach extends the Maximum Mean Discrepancy to the copula of the joint distribution. We prove that this approach has several advantageous properties. Like Shannon mutual information, the proposed dependence measure is invariant to any strictly increasing transformation of the marginal variables, which is important in many applications, for example feature selection. The estimator is consistent, robust to outliers, and uses rank statistics only. We derive upper bounds on its convergence rate and propose independence tests based on it. We illustrate the theoretical contributions through a series of experiments in feature selection and low-dimensional embedding of distributions.
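To make the idea concrete, below is a minimal Python sketch of the kind of estimator the abstract describes: the data are mapped to their empirical copula via normalized ranks, and a Gaussian-kernel MMD is then computed between these pseudo-observations and samples from the independence (uniform) copula. The kernel bandwidth, the use of uniform samples rather than closed-form expectations, and the function names are illustrative assumptions, not the authors' exact estimator.

import numpy as np
from scipy.stats import rankdata

def empirical_copula(X):
    # Map each column of X to (0, 1) via normalized ranks (pseudo-observations).
    n = X.shape[0]
    return np.column_stack([rankdata(X[:, j]) / (n + 1) for j in range(X.shape[1])])

def mmd2(U, V, sigma=0.5):
    # Biased (V-statistic) squared MMD between samples U and V with a Gaussian kernel.
    def gram(A, B):
        d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
        return np.exp(-d2 / (2.0 * sigma**2))
    return gram(U, U).mean() + gram(V, V).mean() - 2.0 * gram(U, V).mean()

def copula_mmd_dependence(X, sigma=0.5, seed=0):
    # Dependence score: MMD between the empirical copula of X and the
    # independence copula, approximated here by i.i.d. uniform samples.
    rng = np.random.default_rng(seed)
    U = empirical_copula(X)
    V = rng.uniform(size=U.shape)  # samples from the independence (uniform) copula
    return mmd2(U, V, sigma)

# Example: a nonlinearly dependent pair scores higher than an independent pair.
rng = np.random.default_rng(1)
x = rng.normal(size=500)
dep = np.column_stack([x, x**2 + 0.1 * rng.normal(size=500)])
ind = rng.normal(size=(500, 2))
print(copula_mmd_dependence(dep), copula_mmd_dependence(ind))

Because the statistic depends on the data only through ranks, it is invariant to strictly increasing transformations of the marginals, which is the property highlighted in the abstract.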
BibTeX
@conference{Poczos-2012-119792,
  author    = {B. Poczos and Z. Ghahramani and J. Schneider},
  title     = {Copula-based Kernel Dependency Measures},
  booktitle = {Proceedings of (ICML) International Conference on Machine Learning},
  year      = {2012},
  month     = {June},
  pages     = {1635--1642},
}