Enhancing Semantic Representations of Bilingual Word Embeddings with Syntactic Dependencies

2019-11-07
Abstract Cross-lingual representation is a technique that can both represent different languages in the same latent vector space and enable knowledge transfer across languages. To learn such representations, most existing works require parallel sentences with word-level alignments and assume that aligned words have similar Bag-of-Words (BoW) contexts. However, because grammatical structures differ across languages, the contexts of aligned words may appear at different positions within their sentences. To address these syntactic differences across languages, we propose a bilingual word embedding model that integrates syntactic dependencies (DepBiWE), producing dependency parse trees that encode the accurate relative positions of the contexts of aligned words. In addition, a new method is proposed to jointly learn bilingual word embeddings from dependency-based contexts and BoW contexts. Extensive experimental results on a real-world dataset clearly validate the superiority of the proposed DepBiWE model on various natural language processing (NLP) tasks.
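The contrast between BoW contexts and dependency-based contexts can be illustrated with a minimal sketch. This is not the authors' implementation: the toy sentence and its parse are hand-specified here (DepBiWE would obtain parses from a dependency parser), and the `word/relation` context format follows the common dependency-context convention rather than anything stated in the abstract.

```python
# Illustrative sketch (assumption: not the DepBiWE code). Contrasts
# Bag-of-Words contexts, which depend on linear word positions, with
# dependency-based contexts, which depend only on parse-tree arcs.

def bow_contexts(tokens, window=2):
    """(word, context) pairs from a symmetric token window."""
    pairs = []
    for i, w in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((w, tokens[j]))
    return pairs

def dep_contexts(tokens, heads, labels):
    """(word, context) pairs from dependency arcs. Each context carries
    its relation label, so the pair is the same no matter where the
    words sit in the sentence -- the property the abstract relies on."""
    pairs = []
    for i, w in enumerate(tokens):
        h = heads[i]
        if h >= 0:  # skip the root, which has no head
            pairs.append((w, f"{tokens[h]}/{labels[i]}"))        # child -> head
            pairs.append((tokens[h], f"{w}/{labels[i]}-inv"))    # head -> child
    return pairs

# Hand-specified toy parse: "reads" is the root; heads are token indices.
tokens = ["she", "reads", "books"]
heads  = [1, -1, 1]
labels = ["nsubj", "root", "obj"]

print(bow_contexts(tokens))
print(dep_contexts(tokens, heads, labels))
```

If the two languages order subject and object differently, the BoW pairs change while the dependency pairs (e.g. `("she", "reads/nsubj")`) stay the same, which is why aligned words keep comparable contexts under the dependency view.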
