DYNAMICALLY PRUNED MESSAGE PASSING NETWORKS FOR LARGE-SCALE KNOWLEDGE GRAPH REASONING

2020-01-02

Abstract
We propose Dynamically Pruned Message Passing Networks (DPMPN) for large-scale knowledge graph reasoning. In contrast to existing models, embedding-based or path-based, we learn an input-dependent subgraph to explicitly model the reasoning process. Subgraphs are dynamically constructed and expanded by applying a graphical attention mechanism conditioned on input queries. In this way, we not only construct graph-structured explanations but also enable the message passing designed in Graph Neural Networks (GNNs) to scale with graph sizes. We take inspiration from the consciousness prior proposed by Bengio (2017) and develop a two-GNN framework that encodes an input-agnostic full-structure representation and learns an input-dependent local one, coordinated by an attention module. Experiments show the reasoning capability of our model to provide clear graphical explanations as well as predict results accurately, outperforming most state-of-the-art methods on knowledge base completion tasks.
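The core idea of the abstract — growing a small, query-conditioned subgraph by scoring frontier neighbors with attention and keeping only the best candidates — can be sketched as follows. This is a minimal illustration only, not the paper's implementation: the toy graph, the random embeddings, and the dot-product attention score are all invented for the example.

```python
import random

random.seed(0)

# Toy knowledge graph: node -> neighbor list (illustrative, not the paper's data).
graph = {
    0: [1, 2, 3],
    1: [0, 4],
    2: [0, 4, 5],
    3: [0, 5],
    4: [1, 2, 6],
    5: [2, 3, 6],
    6: [4, 5],
}

# Random vectors stand in for learned node embeddings (hypothetical).
dim = 4
emb = {n: [random.gauss(0, 1) for _ in range(dim)] for n in graph}

def attention(query_vec, node):
    """Dot-product attention score between the query and a candidate node."""
    return sum(q * e for q, e in zip(query_vec, emb[node]))

def expand_subgraph(seed, query_vec, steps=2, top_k=2):
    """Grow an attended subgraph: at each step, score the frontier's unseen
    neighbors with attention and keep only the top-k, pruning the rest."""
    visited = {seed}
    frontier = {seed}
    for _ in range(steps):
        candidates = {nb for n in frontier for nb in graph[n]} - visited
        if not candidates:
            break
        ranked = sorted(candidates, key=lambda n: attention(query_vec, n),
                        reverse=True)
        frontier = set(ranked[:top_k])      # dynamic pruning step
        visited |= frontier
    return visited

# The query is conditioned on the seed entity's embedding in this toy setup.
sub = expand_subgraph(seed=0, query_vec=emb[0])
print(sorted(sub))
```

Message passing would then run only over `sub` rather than the full graph, which is what lets the method scale; the pruned subgraph itself doubles as a graph-structured explanation of the prediction.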

