Provable Gradient Variance Guarantees for Black-Box Variational Inference

2020-02-23

Abstract

Recent variational inference methods use stochastic gradient estimators whose variance is not well understood. Theoretical guarantees for these estimators are important to understand when these methods will or will not work. This paper gives bounds for the common “reparameterization” estimators when the target is smooth and the variational family is a location-scale distribution. These bounds are unimprovable and thus provide the best possible guarantees under the stated assumptions.
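The reparameterization estimator the abstract refers to samples a fixed base noise variable and pushes it through the location-scale transform, so that gradients with respect to the variational parameters can be taken through the sample itself. A minimal sketch of this estimator, assuming a standard-normal target (so `grad_log_target` is `-z`) and a one-dimensional Gaussian variational family `q = N(m, s^2)` chosen here purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def grad_log_target(z):
    # Illustrative smooth target: standard normal, so grad log p(z) = -z.
    # In black-box VI this would come from the model (e.g. via autodiff).
    return -z

def reparam_grad(m, s, n_samples=1000):
    """Reparameterization gradient estimate of E_q[log p(z)] for the
    location-scale family q: z = m + s * eps, eps ~ N(0, 1).

    By the chain rule:
      d/dm log p(z) = grad_log_p(z)
      d/ds log p(z) = grad_log_p(z) * eps
    """
    eps = rng.standard_normal(n_samples)
    z = m + s * eps
    g = grad_log_target(z)
    return g.mean(), (g * eps).mean()

grad_m, grad_s = reparam_grad(m=2.0, s=1.0)
```

For this toy target the exact gradients are -m and -s, so the Monte Carlo estimates should land near -2 and -1; the spread of the per-sample terms around those values is exactly the gradient variance that the paper's bounds control.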

