Estimation of Rényi Entropy and Mutual Information Based on Generalized Nearest-Neighbor Graphs

2020-01-06

Abstract

We present simple and computationally efficient nonparametric estimators of Rényi entropy and mutual information based on an i.i.d. sample drawn from an unknown, absolutely continuous distribution over ℝ^d. The estimators are computed as the sum of p-th powers of the Euclidean lengths of the edges of the 'generalized nearest-neighbor' graph of the sample and of the empirical copula of the sample, respectively. For the first time, we prove the almost sure consistency of these estimators and upper bounds on their rates of convergence, the latter under the assumption that the density underlying the sample is Lipschitz continuous. Experiments demonstrate their usefulness in independent subspace analysis.
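The core statistic described in the abstract can be sketched in a few lines: sum the p-th powers of the edge lengths of a nearest-neighbor graph, then plug the sum into a Rényi entropy estimate. The sketch below is a minimal brute-force illustration, not the paper's implementation; the relation p = d(1 − α) and the normalizing constant `gamma` follow the standard nearest-neighbor entropy-estimation setup and are assumptions here (in particular, `gamma` depends on d, p, and k and is simply taken as given).

```python
import math

def nn_pth_power_sum(points, p, k=1):
    """Brute-force stand-in for the L_p statistic: for each sample
    point, add the p-th powers of the Euclidean distances to its
    k nearest neighbors (edges of a k-nearest-neighbor graph)."""
    total = 0.0
    for i, x in enumerate(points):
        dists = sorted(
            math.dist(x, y) for j, y in enumerate(points) if j != i
        )
        total += sum(d ** p for d in dists[:k])
    return total

def renyi_entropy_estimate(points, alpha, gamma, k=1):
    """Plug-in Renyi entropy estimate of order alpha (0 < alpha < 1),
    assuming the normalizing constant gamma is known and p = d*(1-alpha)."""
    n, d = len(points), len(points[0])
    p = d * (1 - alpha)
    L = nn_pth_power_sum(points, p, k)
    return 1.0 / (1.0 - alpha) * math.log(L / (gamma * n ** alpha))
```

For large samples the O(n^2) neighbor search would be replaced by a k-d tree; the sketch only illustrates the shape of the estimator, not its asymptotic behavior.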

