Improving Cross-Domain Performance for Relation Extraction via
Dependency Prediction and Information Flow Control
Abstract
Relation Extraction (RE) is one of the fundamental
tasks in Information Extraction and Natural Language Processing. Dependency trees have been
shown to be a very useful source of information
for this task. Current deep learning models
for relation extraction have mainly exploited this dependency information by guiding their computation
along the structures of the dependency trees. One
potential problem with this approach is that it might
prevent the models from capturing important context information beyond the syntactic structures,
causing poor cross-domain generalization. This
paper introduces a novel method to use dependency
trees in RE for deep learning models that jointly
predicts dependency and semantic relations. We
also propose a new mechanism to control the information flow in the model based on the input entity mentions. Our extensive experiments on benchmark datasets show that the proposed model significantly outperforms the existing methods for RE.
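The abstract does not specify how the entity-based information flow control is realized; one plausible formulation is an element-wise sigmoid gate computed from the entity-mention representations and applied to every token's hidden state. The sketch below illustrates this idea only; all names, shapes, and the averaging choice are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def entity_gated_states(hidden, entity_idx, W_g, b_g):
    """Gate token representations with a control vector derived from
    the entity-mention hidden states (hypothetical formulation)."""
    # Control vector: average of the entity-mention hidden states.
    control = hidden[list(entity_idx)].mean(axis=0)  # shape (d,)
    # Element-wise gate in (0, 1), conditioned on the control vector.
    gate = sigmoid(control @ W_g + b_g)              # shape (d,)
    # Filter every token representation through the same gate.
    return hidden * gate                             # shape (n, d)

# Toy usage: 5 tokens, hidden size 4, entity mentions at positions 0 and 3.
rng = np.random.default_rng(0)
hidden = rng.standard_normal((5, 4))
W_g = rng.standard_normal((4, 4))
b_g = np.zeros(4)
gated = entity_gated_states(hidden, (0, 3), W_g, b_g)
print(gated.shape)  # (5, 4)
```

Because the gate values lie strictly between 0 and 1, the gated representations are element-wise attenuated versions of the original hidden states, suppressing dimensions deemed irrelevant given the entity mentions.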