
An information-theoretic approach to statistical dependence: Copula information

R. S. Calsaverini and R. Vicente

Published 22 December 2009 • Europhysics Letters Association

Citation: R. S. Calsaverini and R. Vicente 2009 EPL 88 68003
DOI: 10.1209/0295-5075/88/68003

Abstract

We discuss the connection between information theory and copula theory by showing that a copula can be employed to decompose the information content of a multivariate distribution into marginal and dependence components, with the latter quantified by the mutual information. We define the information excess as a measure of deviation from a maximum-entropy distribution. The idea of marginal-invariant dependence measures is also discussed and used to show that empirical linear correlation underestimates the amplitude of the actual correlation in the case of non-Gaussian marginals. The mutual information is shown to provide an upper bound for the asymptotic empirical log-likelihood of a copula. An analytical expression for the information excess of T-copulas is provided, allowing for simple model identification within this family. We illustrate the framework on a financial data set.
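The decomposition described in the abstract splits the joint entropy into marginal entropies minus the mutual information, H(X,Y) = H(X) + H(Y) − I(X;Y), with the copula entropy equal to −I(X;Y). A minimal sketch of this identity for a bivariate Gaussian (where the mutual information has the closed form I = −½ ln(1 − ρ²)) might look as follows; the function name and the choice of ρ are illustrative, not from the paper:

```python
import numpy as np

def gaussian_entropy(cov):
    # Differential entropy of a d-dimensional Gaussian:
    # 0.5 * ln((2*pi*e)^d * det(cov))
    d = cov.shape[0]
    return 0.5 * np.log((2 * np.pi * np.e) ** d * np.linalg.det(cov))

rho = 0.7  # illustrative correlation coefficient
cov = np.array([[1.0, rho], [rho, 1.0]])

# Marginal entropies of the unit-variance Gaussian marginals
h_x = gaussian_entropy(np.array([[1.0]]))
h_y = h_x
h_xy = gaussian_entropy(cov)

# Mutual information = sum of marginal entropies minus joint entropy
mi = h_x + h_y - h_xy

# Closed form for the bivariate Gaussian; the copula entropy is its
# negative, hence non-positive, and vanishes only at independence.
mi_closed = -0.5 * np.log(1 - rho ** 2)
copula_entropy = -mi

print(mi, mi_closed, copula_entropy)
```

The marginal terms cancel in the difference, so the mutual information depends on the copula alone; this is the marginal invariance the abstract refers to.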
