Discrete-Continuous Mixtures in Probabilistic Programming: Generalized Semantics and Inference Algorithms

2020-03-19

Abstract

Despite the recent successes of probabilistic programming languages (PPLs) in AI applications, PPLs offer only limited support for random variables whose distributions combine discrete and continuous elements. We develop the notion of measure-theoretic Bayesian networks (MTBNs) and use it to provide more general semantics for PPLs with arbitrarily many random variables defined over arbitrary measure spaces. We develop two new general sampling algorithms that are provably correct under the MTBN framework: lexicographic likelihood weighting (LLW) for general MTBNs and the lexicographic particle filter (LPF), a specialized algorithm for state-space models. We further integrate MTBNs into a widely used PPL system, BLOG, and verify the effectiveness of the new inference algorithms through representative examples.
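For readers unfamiliar with the baseline that LLW generalizes, the following is a minimal sketch of classical likelihood weighting on a two-variable discrete Bayesian network (Rain → WetGrass). The network, its probabilities, and the function names are illustrative assumptions, not from the paper; the lexicographic handling of mixed discrete-continuous densities that distinguishes LLW is not implemented here.

```python
import random

# Hypothetical toy network: Rain -> WetGrass (probabilities are made up).
P_RAIN = 0.2                            # P(Rain = True)
P_WET_GIVEN = {True: 0.9, False: 0.1}   # P(WetGrass = True | Rain)

def weighted_sample(evidence_wet=True):
    """Sample the non-evidence variable; weight by the evidence likelihood.

    Evidence variables are fixed to their observed values rather than
    sampled; the probability of that observation becomes the sample weight.
    """
    rain = random.random() < P_RAIN
    p_wet = P_WET_GIVEN[rain]
    weight = p_wet if evidence_wet else 1.0 - p_wet
    return rain, weight

def estimate_p_rain_given_wet(n=100_000, seed=0):
    """Estimate P(Rain = True | WetGrass = True) by weighted averaging."""
    random.seed(seed)
    num = den = 0.0
    for _ in range(n):
        rain, w = weighted_sample(evidence_wet=True)
        num += w * rain
        den += w
    return num / den

print(round(estimate_p_rain_given_wet(), 3))  # close to 0.18 / 0.26 ≈ 0.692
```

The estimate converges to the exact posterior 0.9·0.2 / (0.9·0.2 + 0.1·0.8) ≈ 0.692. LLW extends this weighting scheme so that it remains provably correct when evidence likelihoods mix probability masses and densities, which naive weighting cannot compare directly.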

