On the Use of Variational Inference for Learning Discrete Graphical Models

2020-02-27

Abstract

We study the general class of estimators for graphical model structure based on optimizing an ℓ₁-regularized approximate log-likelihood, where the approximate likelihood uses tractable variational approximations of the partition function. We provide a message-passing algorithm that directly computes the ℓ₁-regularized approximate MLE. Further, in the case of certain reweighted entropy approximations to the partition function, we show that, surprisingly, the ℓ₁-regularized approximate MLE estimator has a closed form, so that we no longer need to run many iterations of approximate inference and message-passing. Lastly, we analyze this general class of estimators for graph structure recovery, i.e., its sparsistency, and show that it is indeed sparsistent under certain conditions.
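To make the estimator class concrete, the following is a minimal sketch of an ℓ₁-regularized approximate MLE for a pairwise Ising model, using proximal gradient descent with soft-thresholding on the edge parameters. It substitutes a naive mean-field approximation of the log-partition function for the reweighted entropy approximations analyzed in the paper, and all function names and hyperparameters here are illustrative assumptions, not the authors' implementation. The key fact it exploits is that the gradient of the approximate log-partition function with respect to the parameters is the vector of variational moments.

```python
import numpy as np

def mean_field_moments(theta_node, theta_edge, iters=100):
    # Naive mean-field fixed point for an Ising model over x in {-1, +1}:
    #   mu_s = tanh(theta_s + sum_t theta_st * mu_t)
    # (stands in for the reweighted approximations used in the paper)
    p = len(theta_node)
    mu = np.zeros(p)
    for _ in range(iters):
        for s in range(p):
            mu[s] = np.tanh(theta_node[s] + theta_edge[s] @ mu)
    return mu

def l1_approx_mle(X, lam=0.1, lr=0.05, steps=300):
    """X: n x p matrix of +/-1 samples. Returns (node, edge) parameters."""
    n, p = X.shape
    m_node = X.mean(axis=0)            # empirical moments E[x_s]
    m_edge = (X.T @ X) / n             # empirical pairwise moments E[x_s x_t]
    np.fill_diagonal(m_edge, 0.0)
    th_n = np.zeros(p)
    th_e = np.zeros((p, p))
    for _ in range(steps):
        mu = mean_field_moments(th_n, th_e)
        # gradient of the approximate log-partition function = variational moments,
        # so the gradient of the negative approximate log-likelihood is:
        g_n = mu - m_node
        g_e = np.outer(mu, mu) - m_edge
        np.fill_diagonal(g_e, 0.0)
        th_n -= lr * g_n
        th_e -= lr * g_e
        # proximal step: soft-threshold edge weights to enforce l1 sparsity
        th_e = np.sign(th_e) * np.maximum(np.abs(th_e) - lr * lam, 0.0)
        th_e = (th_e + th_e.T) / 2     # keep the edge matrix symmetric
    return th_n, th_e
```

Edges driven exactly to zero by the soft-thresholding step are the ones excluded from the recovered graph, which is the sparsistency question the abstract refers to; the paper's closed-form result removes the inner inference loop entirely for certain reweighted approximations.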
