Nonparametric von Mises Estimators for Entropies, Divergences and Mutual Informations

2020-02-04

Abstract

 We propose and analyse estimators for statistical functionals of one or more distributions under nonparametric assumptions. Our estimators are derived from the von Mises expansion and are based on the theory of influence functions, which appear in the semiparametric statistics literature. We show that estimators based either on data-splitting or a leave-one-out technique enjoy fast rates of convergence and other favorable theoretical properties. We apply this framework to derive estimators for several popular information theoretic quantities, and via empirical evaluation, show the advantage of this approach over existing estimators.
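For intuition, the approach rests on the first-order von Mises expansion T(Q) ≈ T(P) + ∫ ψ_P(x) dQ(x), where ψ_P is the influence function of the functional T at P. With data splitting, one half of the sample yields a preliminary density estimate p̂, and the other half supplies the empirical average of ψ_p̂, which corrects the plug-in value T(p̂). The sketch below illustrates this one-step, data-split construction for Shannon differential entropy H(p) = -∫ p log p, whose influence function is ψ_p(x) = -log p(x) - H(p). It uses scipy's gaussian_kde as the preliminary estimator and is only an illustration of the general scheme under these assumptions, not the paper's exact estimator (the paper analyzes higher-order corrections and specific nonparametric estimators).

```python
import numpy as np
from scipy.stats import gaussian_kde

def entropy_one_step_split(x, seed=0):
    """First-order von Mises (influence-function corrected) estimator of
    differential entropy with data splitting.

    Fit a density on one half of the sample; on the other half, add the
    empirical mean of the influence function psi(x) = -log p_hat(x) - H(p_hat)
    to the plug-in value H(p_hat). For entropy the two H(p_hat) terms cancel,
    so the one-step estimator is simply the held-out average of -log p_hat.
    """
    x = np.asarray(x, dtype=float)
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(x))
    half = len(x) // 2
    fit_half, eval_half = x[idx[:half]], x[idx[half:]]

    p_hat = gaussian_kde(fit_half)            # preliminary nonparametric density estimate
    return -np.mean(p_hat.logpdf(eval_half))  # plug-in + first-order correction

# Usage: a standard normal has entropy 0.5 * log(2 * pi * e) ≈ 1.4189.
sample = np.random.default_rng(1).standard_normal(5000)
print(entropy_one_step_split(sample))
```

Splitting the sample keeps the held-out correction term independent of the fitted density; the paper also analyzes a leave-one-out variant that avoids halving the effective sample size.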
