Matching the Blanks: Distributional Similarity for Relation Learning


2019-09-20
Abstract: General purpose relation extractors, which can model arbitrary relations, are a core aspiration in information extraction. Efforts have been made to build general purpose extractors that represent relations with their surface forms, or that jointly embed surface forms with relations from an existing knowledge graph. However, both of these approaches are limited in their ability to generalize. In this paper, we build on extensions of Harris' distributional hypothesis to relations, as well as recent advances in learning text representations (specifically, BERT), to build task-agnostic relation representations solely from entity-linked text. We show that these representations significantly outperform previous work on exemplar-based relation extraction (FewRel) even without using any of that task's training data. We also show that models initialized with our task-agnostic representations, and then tuned on supervised relation extraction datasets, significantly outperform previous methods on SemEval 2010 Task 8, KBP37, and TACRED.
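The "matching the blanks" objective described in the abstract trains on pairs of entity-linked sentences: two sentences form a positive pair when they mention the same entity pair, and entity mentions are randomly replaced with a [BLANK] token so the model cannot simply memorize entity names. The sketch below illustrates this data construction under simplifying assumptions (single contiguous mention spans, Wikidata-style entity ids); the function names and the masking probability `alpha` are illustrative, not taken from the paper's code.

```python
import random

BLANK = "[BLANK]"

def mask_entities(tokens, spans, alpha=0.7, rng=random):
    """Replace each entity mention span with [BLANK] with probability
    alpha, as in the matching-the-blanks training objective (sketch).
    spans: list of (start, end) token index pairs."""
    out = list(tokens)
    # Replace later spans first so earlier indices remain valid
    # after a multi-token mention collapses to a single [BLANK].
    for start, end in sorted(spans, reverse=True):
        if rng.random() < alpha:
            out[start:end] = [BLANK]
    return out

def make_pairs(statements):
    """statements: list of (tokens, e1_id, e2_id) triples from
    entity-linked text. Yields ((tokens_a, tokens_b), label) where
    label = 1 iff both statements link the same entity pair --
    the weak supervision signal for relation similarity."""
    for i in range(len(statements)):
        for j in range(i + 1, len(statements)):
            ta, a1, a2 = statements[i]
            tb, b1, b2 = statements[j]
            yield (ta, tb), int((a1, a2) == (b1, b2))
```

With `alpha=1.0`, `mask_entities(["Paris", "is", "the", "capital", "of", "France"], [(0, 1), (5, 6)])` yields `["[BLANK]", "is", "the", "capital", "of", "[BLANK]"]`, forcing the relation encoder to rely on the context between the blanks rather than the entity surface forms.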

