
Stochastic Variance-Reduced Hamilton Monte Carlo Methods

2020-03-16

Abstract

We propose a fast stochastic Hamilton Monte Carlo (HMC) method for sampling from a smooth and strongly log-concave distribution. At the core of our proposed method is a variance reduction technique inspired by recent advances in stochastic optimization. We show that, to achieve ε accuracy in 2-Wasserstein distance, our algorithm achieves Õ(N + κ²d^{1/2}ε^{-1} + κ^{4/3}d^{1/3}N^{2/3}ε^{-2/3}) gradient complexity (i.e., number of component gradient evaluations), where N is the number of component functions, d the dimension, and κ the condition number; this outperforms the state-of-the-art HMC and stochastic gradient HMC methods in a wide regime. We also extend our algorithm to sampling from smooth and general log-concave distributions, and prove the corresponding gradient complexity as well. Experiments on both synthetic and real data demonstrate the superior performance of our algorithm.
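The variance-reduction idea described in the abstract can be illustrated with a short sketch. The Python snippet below is a minimal illustration under stated assumptions, not the paper's exact algorithm: it combines an SVRG-style gradient estimator (a snapshot point, a full gradient computed at the snapshot, and single-component corrections) with a simple Euler discretization of underdamped Langevin/HMC dynamics. All names and parameters here (`svr_hmc_sketch`, `grad_fi`, `step_size`, `friction`, `temperature`, `inner_steps`) are illustrative assumptions, and the discretization may differ from the update rule analyzed in the paper.

```python
import numpy as np

def svr_hmc_sketch(grad_fi, N, x0, n_epochs=10, inner_steps=100,
                   step_size=1e-2, friction=1.0, temperature=1.0,
                   rng=None):
    """Minimal sketch of a variance-reduced stochastic-gradient HMC sampler.

    grad_fi(x, i) returns the gradient of the i-th component f_i at x,
    where the target density is proportional to exp(-(1/N) * sum_i f_i(x)).
    This is an illustrative sketch, not the algorithm from the paper.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    v = np.zeros_like(x)                      # auxiliary momentum/velocity
    samples = []

    def full_grad(z):
        # Full (deterministic) gradient, averaged over all N components.
        return sum(grad_fi(z, i) for i in range(N)) / N

    for _ in range(n_epochs):
        x_snap = x.copy()                     # snapshot point for SVRG
        g_snap = full_grad(x_snap)            # full gradient at the snapshot
        for _ in range(inner_steps):
            i = rng.integers(N)
            # SVRG-style variance-reduced gradient estimate:
            # unbiased for the full gradient, with variance shrinking
            # as x approaches the snapshot point.
            g = grad_fi(x, i) - grad_fi(x_snap, i) + g_snap
            # One Euler step of underdamped Langevin / HMC-type dynamics.
            noise = np.sqrt(2.0 * friction * temperature * step_size) \
                    * rng.standard_normal(x.shape)
            v = v - step_size * (friction * v + g) + noise
            x = x + step_size * v
            samples.append(x.copy())
    return np.array(samples)


if __name__ == "__main__":
    # Sanity check on a strongly log-concave target: f_i(x) = 0.5*||x - a_i||^2,
    # whose target distribution is a Gaussian centered at the mean of the a_i.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 2))
    grad_fi = lambda x, i: x - A[i]
    draws = svr_hmc_sketch(grad_fi, N=len(A), x0=np.zeros(2), rng=rng)
    print("sample mean:", draws[len(draws) // 2:].mean(axis=0))
    print("target mean:", A.mean(axis=0))
```

The final block compares the empirical sample mean against the known mean of the Gaussian target, which is one simple way to sanity-check such a sampler on a smooth, strongly log-concave example.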

