
Random Feature Stein Discrepancies

2020-02-13

Abstract

Computable Stein discrepancies have been deployed for a variety of applications, ranging from sampler selection in posterior inference to approximate Bayesian inference to goodness-of-fit testing. Existing convergence-determining Stein discrepancies admit strong theoretical guarantees but suffer from a computational cost that grows quadratically in the sample size. While linear-time Stein discrepancies have been proposed for goodness-of-fit testing, they exhibit avoidable degradations in testing power—even when power is explicitly optimized. To address these shortcomings, we introduce feature Stein discrepancies (ΦSDs), a new family of quality measures that can be cheaply approximated using importance sampling. We show how to construct ΦSDs that provably determine the convergence of a sample to its target and develop high-accuracy approximations—random ΦSDs (RΦSDs)—which are computable in near-linear time. In our experiments with sampler selection for approximate posterior inference and goodness-of-fit testing, RΦSDs perform as well or better than quadratic-time KSDs while being orders of magnitude faster to compute.
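To give a concrete sense of the near-linear-time idea, the sketch below is a minimal illustrative Monte Carlo approximation of a feature-based Stein discrepancy, not the paper's actual ΦSD/RΦSD construction: it applies the Langevin Stein operator to random Fourier features and averages over frequencies drawn from a standard normal proposal (all function names, the proposal, and the 1-D setup are assumptions for illustration). The cost is O(nM) for n samples and M features, rather than the O(n²) of a kernel Stein discrepancy.

```python
import numpy as np

def rfsd(x, score, n_features=100, seed=0):
    """Illustrative random-feature Stein discrepancy sketch (1-D).

    Approximates E_omega |(1/n) sum_i T_p phi(x_i, omega)|^2, where
    phi(x, omega) = exp(i * omega * x) is a random Fourier feature and
    T_p f = score(x) * f + f' is the Langevin Stein operator.
    Frequencies omega are drawn from a standard normal proposal
    (a simplifying assumption, not the paper's importance-sampling scheme).
    Cost is O(n * n_features): near-linear in the sample size.
    """
    rng = np.random.default_rng(seed)
    omegas = rng.standard_normal(n_features)
    x = np.asarray(x, dtype=float)
    s = score(x)                              # score evaluated at the samples
    # Stein-transformed feature: (score(x) + i*omega) * exp(i*omega*x)
    phase = np.exp(1j * np.outer(x, omegas))  # shape (n, M)
    feats = (s[:, None] + 1j * omegas[None, :]) * phase
    mean_feat = feats.mean(axis=0)            # empirical Stein expectation
    return float(np.mean(np.abs(mean_feat) ** 2))

# Target: standard normal, so score(x) = -x.
rng = np.random.default_rng(1)
good = rng.standard_normal(2000)        # samples from the target
bad = rng.standard_normal(2000) + 2.0   # shifted, off-target samples
d_good = rfsd(good, lambda x: -x)
d_bad = rfsd(bad, lambda x: -x)
```

For samples matching the target the statistic is near zero, while the shifted sample produces a clearly larger value, which is the qualitative behavior a convergence-determining discrepancy must have.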

