
PAC-Bayes Un-Expected Bernstein Inequality


Abstract

We present a new PAC-Bayesian generalization bound. Standard bounds contain a $\sqrt{L_n \cdot \mathrm{KL}/n}$ complexity term which dominates unless $L_n$, the empirical error of the learning algorithm's randomized predictions, vanishes. We manage to replace $L_n$ by a term which vanishes in many more situations, essentially whenever the employed learning algorithm is sufficiently stable on the dataset at hand. Our new bound consistently beats state-of-the-art bounds both on a toy example and on UCI datasets (with large enough $n$). Theoretically, unlike existing bounds, our new bound can be expected to converge to 0 faster whenever a Bernstein/Tsybakov condition holds, thus connecting PAC-Bayesian generalization and excess risk bounds; for the latter it has long been known that faster convergence can be obtained under Bernstein conditions. Our main technical tool is a new concentration inequality which is like Bernstein's but with $X^2$ taken outside its expectation.
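To make the abstract's comparison concrete, the following is a minimal LaTeX sketch, not the paper's exact statements: the notation ($\rho$ for the randomized predictor, $\pi$ for the prior, $\hat L_n$ for the empirical error, $\delta$ for the confidence level, $c_\eta$ for an $\eta$-dependent constant, and the exponent $\beta$ in the Bernstein condition) is standard PAC-Bayes convention assumed here, with all constants and the exact admissible range of $\eta$ omitted.

% Schematic sketch only (standard notation assumed; constants omitted).
% Standard PAC-Bayes bounds typically have the shape
\[
  L(\rho) \;\le\; \hat L_n(\rho)
  + O\!\left(\sqrt{\hat L_n(\rho)\,\tfrac{\mathrm{KL}(\rho\|\pi)+\ln(n/\delta)}{n}}
  + \tfrac{\mathrm{KL}(\rho\|\pi)+\ln(n/\delta)}{n}\right),
\]
% so the square-root term dominates unless the empirical error vanishes.
% Classical Bernstein controls the moment generating function through the
% expected second moment,
\[
  \mathbb{E}\!\left[e^{\eta(\mathbb{E}[X]-X)}\right]
  \;\le\; e^{c_\eta\,\eta^2\,\mathbb{E}[X^2]},
\]
% while an "un-expected" version takes X^2 outside its expectation,
% keeping the random X^2 in the exponent:
\[
  \mathbb{E}\!\left[e^{\eta(\mathbb{E}[X]-X)\,-\,c_\eta\,\eta^2 X^2}\right]
  \;\le\; 1,
\]
% for suitable \eta and a constant c_\eta depending on \eta and an
% almost-sure upper bound on X.
% A Bernstein/Tsybakov condition, in its usual form, asks for B, \beta with
\[
  \mathbb{E}\big[(\ell_h-\ell_{h^*})^2\big]
  \;\le\; B\,\big(\mathbb{E}[\ell_h-\ell_{h^*}]\big)^{\beta}
  \quad \text{for all } h,
\]
% where \ell_h is the loss of hypothesis h and h^* a risk minimizer;
% larger \beta permits faster rates, up to O(1/n) at \beta = 1.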
