Two-Sided Exponential Concentration Bounds for Bayes Error Rate and Shannon Entropy

2020-03-02

Abstract

We provide a method that approximates the Bayes error rate and the Shannon entropy with high probability. The Bayes error rate approximation makes it possible to build a classifier that polynomially approaches the Bayes error rate. The Shannon entropy approximation provides provable performance guarantees for learning trees and Bayesian networks from continuous variables. Our results rely on mild regularity conditions on the unknown probability distributions, and apply to bounded as well as unbounded variables.
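To make the first quantity concrete: the Bayes error rate of a binary problem with class priors $\pi_0, \pi_1$ and class-conditional densities $p_0, p_1$ is $\int \min(\pi_0 p_0(x), \pi_1 p_1(x))\,dx$, the smallest error any classifier can achieve. The sketch below (not the paper's estimator; a hypothetical illustration with two unit-variance Gaussians) computes this integral numerically and checks it against the closed form $\Phi(-|\mu_1 - \mu_0|/2)$ for equal priors.

```python
import math

def normal_pdf(x, mu, sigma=1.0):
    """Density of N(mu, sigma^2)."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def bayes_error_two_gaussians(mu0=0.0, mu1=2.0, prior0=0.5,
                              n=200_000, lo=-10.0, hi=12.0):
    """Bayes error rate = integral of min(prior0*p0, prior1*p1) dx,
    approximated by a midpoint Riemann sum on a fine grid."""
    prior1 = 1.0 - prior0
    step = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * step
        total += min(prior0 * normal_pdf(x, mu0),
                     prior1 * normal_pdf(x, mu1)) * step
    return total

# Closed form for equal priors, unit variance: Phi(-|mu1 - mu0| / 2)
closed_form = 0.5 * (1.0 + math.erf(-1.0 / math.sqrt(2.0)))
print(bayes_error_two_gaussians(), closed_form)  # both ~= 0.1587
```

In practice the densities are unknown, which is exactly why a high-probability approximation of this quantity from samples, as the paper provides, is useful.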
