
Finite-Sample Analysis of Fixed-k Nearest Neighbor Density Functional Estimators

2020-02-05

Abstract

We provide finite-sample analysis of a general framework for using k-nearest neighbor statistics to estimate functionals of a nonparametric continuous probability density, including entropies and divergences. Rather than plugging a consistent density estimate (which requires k → ∞ as the sample size n → ∞) into the functional of interest, the estimators we consider fix k and perform a bias correction. This is more efficient computationally, and, as we show in certain cases, statistically, leading to faster convergence rates. Our framework unifies several previous estimators, for most of which ours are the first finite-sample guarantees.
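As a concrete illustration of the "fix k and bias-correct" idea, the sketch below implements the classical Kozachenko–Leonenko differential entropy estimator, one of the fixed-k estimators this framework covers: instead of letting k grow so that the density estimate is consistent, k stays fixed and the digamma terms correct the bias. The function name kl_entropy and the use of SciPy's cKDTree are assumptions made for this example, not code from the paper.

```python
import numpy as np
from scipy.special import digamma, gammaln
from scipy.spatial import cKDTree

def kl_entropy(x, k=3):
    """Kozachenko-Leonenko fixed-k kNN estimate of differential entropy (nats).

    x : (n, d) array of i.i.d. samples from an unknown continuous density.
    k : fixed number of neighbors; it does NOT grow with n -- the
        digamma terms below supply the bias correction instead.
    Assumes no duplicate points (a zero k-NN distance would give log(0)).
    """
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    # Distance from each point to its k-th nearest neighbor (excluding itself).
    tree = cKDTree(x)
    eps = tree.query(x, k=k + 1)[0][:, -1]
    # Log volume of the d-dimensional unit Euclidean ball.
    log_c_d = (d / 2.0) * np.log(np.pi) - gammaln(d / 2.0 + 1.0)
    # Bias-corrected plug-in: psi(n) - psi(k) replaces log(n) - log(k).
    return digamma(n) - digamma(k) + log_c_d + d * np.mean(np.log(eps))
```

As a quick sanity check, on a few thousand draws from a standard 2-d Gaussian, kl_entropy(x, k=3) should come out close to the true entropy log(2πe) ≈ 2.84 nats.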


