Large Scale Variational Bayesian Inference for Structured Scale Mixture Models

2020-02-28

Abstract

Natural image statistics exhibit hierarchical dependencies across multiple scales. Representing such prior knowledge in non-factorial latent tree models can substantially boost the performance of image denoising, inpainting, deconvolution, or reconstruction, beyond standard factorial "sparse" methodology. We derive a large-scale approximate Bayesian inference algorithm for linear models with non-factorial (latent tree-structured) scale mixture priors. Experimental results on a range of denoising and inpainting problems demonstrate substantially improved performance compared to MAP estimation or to inference with factorial priors.
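The scale mixture construction the abstract refers to represents each coefficient as a Gaussian whose variance is itself a latent random scale; heavy-tailed "sparse" marginals arise from the mixing distribution. The following is a minimal illustrative sketch (not the paper's algorithm): with independent exponential mixing on the variances, the marginal is Laplace, which is the factorial sparse prior the paper improves upon by coupling the latent scales in a tree. All variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Gaussian scale mixture: x = sqrt(z) * u with u ~ N(0, 1) and a latent
# scale z. With z ~ Exponential, the marginal of x is Laplace — a standard
# factorial "sparse" prior for image coefficients. The paper's structured
# prior instead couples the z_i through a latent tree rather than drawing
# them independently as done here.
z = rng.exponential(scale=1.0, size=n)  # independent (factorial) latent scales
u = rng.standard_normal(n)
x = np.sqrt(z) * u

def excess_kurtosis(a):
    """Sample excess kurtosis: ~0 for a Gaussian, ~3 for a Laplace."""
    a = a - a.mean()
    return (a ** 4).mean() / (a ** 2).mean() ** 2 - 3.0

g = rng.standard_normal(n)  # Gaussian baseline for comparison
print(excess_kurtosis(x))   # clearly positive: heavy-tailed marginal
print(excess_kurtosis(g))   # near zero
```

The heavy tails produced by the latent scales are what make such priors effective for sparse image coefficients; the paper's contribution is performing approximate Bayesian (rather than MAP) inference when the scales are tree-structured instead of independent.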
