
Lifted Probabilistic Inference by First-Order Knowledge Compilation

2019-11-12

Abstract

Probabilistic logical languages provide powerful formalisms for knowledge representation and learning. Yet performing inference in these languages is extremely costly, especially if it is done at the propositional level. Lifted inference algorithms, which avoid repeated computation by treating indistinguishable groups of objects as one, help mitigate this cost. Seeking inspiration from logical inference, where lifted inference (e.g., resolution) is commonly performed, we develop a model-theoretic approach to probabilistic lifted inference. Our algorithm compiles a first-order probabilistic theory into a first-order deterministic decomposable negation normal form (d-DNNF) circuit. Compilation offers the advantage that inference is polynomial in the size of the circuit. Furthermore, by borrowing techniques from the knowledge compilation literature, our algorithm effectively exploits the logical structure (e.g., context-specific independencies) within the first-order model, which allows more computation to be done at the lifted level. An empirical comparison demonstrates the utility of the proposed approach.
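To see why inference is polynomial in the size of a d-DNNF circuit, consider a minimal sketch (not the paper's lifted algorithm, and using an assumed node encoding and made-up weights): decomposability lets AND nodes multiply the weighted counts of their independent children, and determinism lets OR nodes simply add the counts of their mutually exclusive children, so a single bottom-up pass suffices.

```python
# Illustrative sketch of weighted model counting on a tiny propositional
# d-DNNF circuit. Node encoding here is hypothetical:
#   ("lit", var, sign)  -- a literal
#   ("and", children)   -- decomposable AND (children share no variables)
#   ("or", children)    -- deterministic OR (children are mutually exclusive)

WEIGHTS = {"rain": 0.2, "sprinkler": 0.6}  # assumed P(var = true)

def wmc(node):
    kind = node[0]
    if kind == "lit":
        _, var, sign = node
        w = WEIGHTS[var]
        return w if sign else 1.0 - w
    children = node[1]
    if kind == "and":
        # Decomposability: sub-circuits are independent, so multiply.
        p = 1.0
        for c in children:
            p *= wmc(c)
        return p
    if kind == "or":
        # Determinism: branches are mutually exclusive, so add.
        return sum(wmc(c) for c in children)
    raise ValueError(f"unknown node kind: {kind}")

# "rain OR sprinkler", rewritten deterministically as
# rain OR (NOT rain AND sprinkler):
circuit = ("or", [("lit", "rain", True),
                  ("and", [("lit", "rain", False),
                           ("lit", "sprinkler", True)])])

print(wmc(circuit))  # 0.2 + 0.8 * 0.6 = 0.68
```

Each node is visited once, so the pass is linear in circuit size; the paper's contribution lies in building such circuits at the first-order (lifted) level rather than propositionally.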

