Abstract
General-purpose relation extractors, which can model arbitrary relations, are a core aspiration in information extraction. Efforts have been made to build general-purpose extractors that represent relations with their surface forms, or that jointly embed surface forms with relations from an existing knowledge graph. However, both of these approaches are limited in their ability to generalize. In this paper, we build on extensions of Harris' distributional hypothesis to relations, as well as on recent advances in learning text representations (specifically, BERT), to learn task-agnostic relation representations solely from entity-linked text. We show that these representations significantly outperform previous work on exemplar-based relation extraction (FewRel), even without using any of that task's training data. We also show that models initialized with our task-agnostic representations, and then tuned on supervised relation extraction datasets, significantly outperform previous methods on SemEval 2010 Task 8, KBP37, and TACRED.
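To make the setup concrete, the sketch below shows one plausible way to derive a fixed-length relation representation for an entity pair from BERT: embed the sentence and pool the contextual embeddings of the two entity spans. It is a minimal illustration, not the paper's exact architecture; the HuggingFace `transformers` API, the mean-pooling strategy, and the example sentence and spans are all assumptions.

```python
# Illustrative sketch only: pooling BERT token embeddings over two entity
# spans to obtain a relation representation. The pooling strategy and the
# example sentence/spans are assumptions, not the paper's exact method.
import torch
from transformers import BertTokenizerFast, BertModel

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def relation_representation(text, head_span, tail_span):
    """Embed `text` with BERT and concatenate the mean-pooled vectors of
    the two entity spans (given as character offsets) into one vector."""
    enc = tokenizer(text, return_tensors="pt", return_offsets_mapping=True)
    offsets = enc.pop("offset_mapping")[0]  # (seq_len, 2) char offsets
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]  # (seq_len, hidden)

    def pool(span):
        start, end = span
        # Select tokens whose character range overlaps the entity span;
        # special tokens like [CLS]/[SEP] have (0, 0) offsets and drop out.
        mask = (offsets[:, 0] < end) & (offsets[:, 1] > start)
        return hidden[mask].mean(dim=0)

    return torch.cat([pool(head_span), pool(tail_span)])  # (2 * hidden,)

# Example: "Pasteur" and "France" as the linked entity pair.
text = "Louis Pasteur was born in France."
vec = relation_representation(text, head_span=(6, 13), tail_span=(26, 32))
print(vec.shape)  # torch.Size([1536])
```

Under this framing, a distributional training signal would push such vectors together for entity pairs that co-occur in similar relational contexts, and the same encoder could later be fine-tuned on a supervised relation extraction dataset.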