Stein Variational Message Passing for Continuous Graphical Models

2020-03-16

Abstract

We propose a novel distributed inference algorithm for continuous graphical models, by extending Stein variational gradient descent (SVGD) (Liu & Wang, 2016) to leverage the Markov dependency structure of the distribution of interest. Our approach combines SVGD with a set of structured local kernel functions defined on the Markov blanket of each node, which alleviates the curse of high dimensionality and simultaneously yields a distributed algorithm for decentralized inference tasks. We justify our method with theoretical analysis and show that the use of local kernels can be viewed as a new type of localized approximation that matches the target distribution on the conditional distributions of each node over its Markov blanket. Our empirical results show that our method outperforms a variety of baselines including standard MCMC and particle message passing methods.
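To make the idea concrete, below is a minimal, hedged sketch of the local-kernel SVGD update the abstract describes, on a toy pairwise Gaussian chain (not the paper's code; the model, bandwidth heuristic, particle count, and step size are illustrative assumptions). Each node's update direction uses a kernel defined only on the coordinates of that node and its Markov blanket, rather than a single kernel on the full high-dimensional state:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy pairwise Gaussian chain p(x) ∝ exp(-0.5 x^T A x); the tridiagonal
# precision A makes the Markov blanket of node v just {v-1, v+1}.
d = 4
A = 2.0 * np.eye(d) - 0.5 * (np.eye(d, k=1) + np.eye(d, k=-1))

def grad_logp(X):
    # Row-wise ∇ log p(x) = -A x (A is symmetric).
    return -X @ A

# Local clique of node v: the node itself plus its Markov blanket.
cliques = [[u for u in (v - 1, v, v + 1) if 0 <= u < d] for v in range(d)]

def local_svgd_step(X, eps=0.2):
    """One SVGD step with a separate local kernel per node."""
    n = X.shape[0]
    G = grad_logp(X)
    phi = np.zeros_like(X)
    for v, c in enumerate(cliques):
        Z = X[:, c]                                  # coordinates on v ∪ MB(v)
        sq = np.sum((Z[:, None] - Z[None, :]) ** 2, axis=-1)
        h = np.median(sq) / np.log(n + 1) + 1e-8     # median-heuristic bandwidth
        K = np.exp(-sq / h)                          # K[j, i] = k_v(x_j, x_i)
        drive = K @ G[:, v] / n                      # E_j[k_v(x_j, x_i) ∂_v log p(x_j)]
        iv = c.index(v)
        D = Z[:, iv][:, None] - Z[:, iv][None, :]    # D[j, i] = x_{j,v} - x_{i,v}
        repulse = np.mean(-2.0 / h * D * K, axis=0)  # E_j[∂_{x_{j,v}} k_v(x_j, x_i)]
        phi[:, v] = drive + repulse
    return X + eps * phi

# Particles start far from the target mean (0) and are driven toward it.
X = rng.normal(loc=3.0, scale=1.0, size=(50, d))
for _ in range(300):
    X = local_svgd_step(X)

print(np.round(X.mean(axis=0), 2))  # close to the target mean, 0
```

Because each per-node kernel only sees a few coordinates, its bandwidth does not have to grow with the total dimension, which is the sense in which the local construction alleviates the curse of dimensionality; it also makes each node's update depend only on its Markov blanket, matching the distributed message-passing view.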

