Probabilistic Variational Bounds for Graphical Models

2020-02-04

Abstract 

Variational algorithms such as tree-reweighted belief propagation can provide deterministic bounds on the partition function, but are often loose and difficult to use in an “any-time” fashion, expending more computation for tighter bounds. On the other hand, Monte Carlo estimators such as importance sampling have excellent any-time behavior, but depend critically on the proposal distribution. We propose a simple Monte Carlo based inference method that augments convex variational bounds by adding importance sampling (IS). We argue that convex variational methods naturally provide good IS proposals that “cover” the target probability, and reinterpret the variational optimization as designing a proposal to minimize an upper bound on the variance of our IS estimator. This both provides an accurate estimator and enables construction of any-time probabilistic bounds that improve quickly and directly on state-of-the-art variational bounds, and provide certificates of accuracy given enough samples relative to the error in the initial bound.
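To make the importance-sampling component concrete, the sketch below estimates a partition function Z with IS on a toy three-state model. The model `theta` and the proposal `q` are hypothetical stand-ins: in the paper's method the proposal would be derived from a convex variational bound, whereas here it is simply a hand-picked distribution that covers the target (q(x) > 0 wherever f(x) > 0), which is the property the abstract argues variational proposals provide.

```python
import math
import random

# Toy unnormalized model over x in {0, 1, 2}: f(x) = exp(theta[x]).
# The true partition function Z = sum_x f(x) is computable exactly here,
# so we can check the estimator against it.
theta = [0.5, 1.0, -0.3]
f = [math.exp(t) for t in theta]
Z_true = sum(f)

# Proposal that "covers" the target: strictly positive on the support.
# Hypothetical choice for illustration; the paper would obtain this from
# a convex variational optimization instead.
q = [0.4, 0.4, 0.2]

def is_estimate(n_samples, rng):
    """Unbiased IS estimate of Z: the average of the weights f(x)/q(x)."""
    total = 0.0
    for _ in range(n_samples):
        x = rng.choices(range(3), weights=q)[0]
        total += f[x] / q[x]
    return total / n_samples

rng = random.Random(0)
Z_hat = is_estimate(100_000, rng)
```

Because each weight f(x)/q(x) is bounded when the proposal covers the target, the estimate concentrates around Z as samples accumulate, which is the any-time behavior the abstract contrasts with purely deterministic variational bounds.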

