Abstract
While word embeddings have been shown to
implicitly encode various forms of attributional knowledge, the extent to which they
capture relational information is far more limited. In previous work, this limitation has been
addressed by incorporating relational knowledge from external knowledge bases when
learning the word embedding. Such strategies
may not be optimal, however, as they are limited by the coverage of available resources and
conflate similarity with other forms of relatedness. As an alternative, in this paper we propose to encode relational knowledge in a separate word embedding, which is designed to complement a given standard word embedding. This relational word embedding is
still learned from co-occurrence statistics, and
can thus be used even when no external knowledge base is available. Our analysis shows that
relational word vectors do indeed capture information that is complementary to what is encoded in standard word embeddings.
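To make concrete what "learned from co-occurrence statistics" can mean here, the following Python sketch derives a simple PPMI-weighted context vector for a word pair from the sentences in which both words co-occur. This is only an illustrative toy, not the method proposed in the paper; the corpus, function names, and the particular weighting scheme are all assumptions made for the example.

```python
import math
from collections import Counter

# Toy corpus; in practice this would be a large text corpus (assumption).
sentences = [
    "paris is the capital of france".split(),
    "berlin is the capital of germany".split(),
    "tokyo is the capital of japan".split(),
    "the louvre is a museum in paris".split(),
]

def pair_context_counts(sentences, w1, w2):
    """Count context words from sentences where both target words co-occur."""
    counts = Counter()
    for sent in sentences:
        if w1 in sent and w2 in sent:
            for tok in sent:
                if tok not in (w1, w2):
                    counts[tok] += 1
    return counts

def unigram_counts(sentences):
    """Corpus-wide word frequencies, used as the background distribution."""
    counts = Counter()
    for sent in sentences:
        counts.update(sent)
    return counts

def ppmi_vector(pair_counts, uni_counts, total):
    """Weight the pair's context counts by positive pointwise mutual information."""
    pair_total = sum(pair_counts.values())
    vec = {}
    for ctx, c in pair_counts.items():
        pmi = math.log((c / pair_total) / (uni_counts[ctx] / total))
        if pmi > 0:
            vec[ctx] = pmi
    return vec

uni = unigram_counts(sentences)
total = sum(uni.values())
pair_vec = ppmi_vector(pair_context_counts(sentences, "paris", "france"), uni, total)
print(pair_vec)  # contexts such as 'capital' receive the highest weights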