Employing the Correspondence of Relations and Connectives
to Identify Implicit Discourse Relations via Label Embeddings
Abstract
It has been shown that implicit connectives can be exploited to improve the performance of models for implicit discourse relation recognition (IDRR). An important property of implicit connectives is that they can be accurately mapped to the discourse relations that convey their functions. In this work, we exploit this property in a multi-task learning framework for IDRR in which the relations and the connectives are predicted simultaneously, and the mapping is leveraged to transfer knowledge between the two prediction tasks via the embeddings of relations and connectives. We propose several techniques to enable such knowledge transfer, yielding state-of-the-art performance for IDRR in several settings of the benchmark Penn Discourse Treebank dataset.