
Barack’s Wife Hillary: Using Knowledge Graphs for Fact-Aware Language Modeling

2019-09-19
Abstract: Modeling human language requires the ability to not only generate fluent text but also encode factual knowledge. However, traditional language models are only capable of remembering facts seen at training time, and often have difficulty recalling them. To address this, we introduce the knowledge graph language model (KGLM), a neural language model with mechanisms for selecting and copying facts from a knowledge graph that are relevant to the context. These mechanisms enable the model to render information it has never seen before, as well as generate out-of-vocabulary tokens. We also introduce the Linked WikiText-2 dataset, a corpus of annotated text aligned to the Wikidata knowledge graph whose contents (roughly) match the popular WikiText-2 benchmark (Merity et al., 2017). In experiments, we demonstrate that the KGLM achieves significantly better performance than a strong baseline language model. We additionally compare different language models' ability to complete sentences requiring factual knowledge, and show that the KGLM outperforms even very large language models in generating facts.
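To give some intuition for the select-and-copy mechanism the abstract describes, here is a minimal, hypothetical sketch of a single decoding step. It is not the authors' implementation: the module name (ToyKGLMStep), the toy vocabulary, the toy fact table (KG), and all dimensions are illustrative assumptions, and the model is untrained. The sketch only shows the control flow: decide whether the next token starts an entity mention and, if so, pick one of the parent entity's facts from the graph and copy the tail entity's name, which is how names outside the word vocabulary can be produced.

```python
# Hypothetical sketch of one KGLM-style decoding step (illustrative only,
# not the paper's code). All names and sizes below are made up.
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy vocabulary and a toy Wikidata-like fact table: head -> [(relation, tail), ...]
VOCAB = ["<unk>", "Barack", "Obama", "is", "married", "to"]
KG = {"Barack_Obama": [("spouse", "Michelle_Obama"), ("birthplace", "Honolulu")]}
RELATIONS = ["spouse", "birthplace"]
H = 32  # arbitrary hidden size


class ToyKGLMStep(nn.Module):
    """One decoding step: choose between an ordinary word and a copied entity name."""

    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(len(VOCAB), H)
        self.lstm = nn.LSTMCell(H, H)
        self.source_head = nn.Linear(H, 2)          # does the next token start a mention?
        self.vocab_head = nn.Linear(H, len(VOCAB))  # standard softmax over words
        self.relation_emb = nn.Embedding(len(RELATIONS), H)

    def forward(self, token_id, state, parent=None):
        h, c = self.lstm(self.embed(token_id), state)

        # 1) Decide whether the next token begins an entity mention.
        p_mention = F.softmax(self.source_head(h), dim=-1)

        # 2a) If a parent entity is in context, score its outgoing facts and
        #     copy the chosen tail entity's name into the text. Copying the
        #     name is what lets the model emit out-of-vocabulary tokens.
        copied_name = None
        if parent is not None and parent in KG:
            facts = KG[parent]
            rel_ids = torch.tensor([RELATIONS.index(r) for r, _ in facts])
            scores = self.relation_emb(rel_ids) @ h.squeeze(0)  # one score per fact
            _, tail = facts[int(scores.argmax())]
            copied_name = tail.replace("_", " ")

        # 2b) Otherwise fall back to the ordinary vocabulary distribution.
        p_vocab = F.softmax(self.vocab_head(h), dim=-1)
        return p_mention, p_vocab, copied_name, (h, c)


if __name__ == "__main__":
    step = ToyKGLMStep()
    state = (torch.zeros(1, H), torch.zeros(1, H))
    tok = torch.tensor([VOCAB.index("to")])  # "... Barack Obama is married to ___"
    p_mention, p_vocab, copied, state = step(tok, state, parent="Barack_Obama")
    # The module is untrained, so the selected fact is arbitrary; the point is
    # that the copied name comes from the graph, not from the word vocabulary.
    print(p_mention, copied)
```

The abstract's notion of facts "relevant to the context" implies additional bookkeeping of which entities have already been mentioned and which neighbors they expose; that state tracking is omitted from this sketch for brevity.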

