
Rényi Divergence Variational Inference

2020-02-05

Abstract 

This paper introduces the variational Rényi bound (VR) that extends traditional variational inference to Rényi's α-divergences. This new family of variational methods unifies a number of existing approaches, and enables a smooth interpolation from the evidence lower-bound to the log (marginal) likelihood that is controlled by the value of α that parametrises the divergence. The reparameterization trick, Monte Carlo approximation and stochastic optimisation methods are deployed to obtain a tractable and unified framework for optimisation. We further consider negative α values and propose a novel variational inference method as a new special case in the proposed framework. Experiments on Bayesian neural networks and variational auto-encoders demonstrate the wide applicability of the VR bound.
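The Monte Carlo estimator the abstract describes can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the authors' implementation: the function names and the `_logsumexp` helper are hypothetical, and the reparameterization trick and stochastic optimisation loop are omitted. It estimates the VR bound L_α = 1/(1-α) · log E_q[(p(x,z)/q(z))^(1-α)], recovering the standard ELBO as α → 1:

```python
import numpy as np

def _logsumexp(a):
    """Numerically stable log-sum-exp of a 1-D array."""
    m = np.max(a)
    return m + np.log(np.sum(np.exp(a - m)))

def vr_bound(log_joint, log_q, sample_q, alpha, n_samples=1000):
    """Monte Carlo estimate of the variational Renyi (VR) bound.

    log_joint: z -> log p(x, z) at the observed data x
    log_q:     z -> log q(z), the approximate posterior density
    sample_q:  k -> array of k samples drawn from q
    alpha:     Renyi parameter; alpha -> 1 recovers the standard ELBO
    """
    z = sample_q(n_samples)
    # log importance weights log p(x, z_k) - log q(z_k)
    log_w = np.array([log_joint(zk) - log_q(zk) for zk in z])
    if np.isclose(alpha, 1.0):
        return float(np.mean(log_w))  # ELBO as the alpha -> 1 limit
    # average w^(1-alpha) over samples in log space for stability
    return float(_logsumexp((1.0 - alpha) * log_w - np.log(n_samples)) / (1.0 - alpha))
```

On a conjugate Gaussian toy model where q is chosen equal to the exact posterior, the importance weights are constant and the estimate equals log p(x) for every α, which gives a quick sanity check of the interpolation property.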

