MIRROR-GENERATIVE NEURAL MACHINE TRANSLATION

2019-12-30

Abstract
Training neural machine translation (NMT) models requires a large parallel corpus, which is scarce for many language pairs. However, raw non-parallel corpora are often easy to obtain. Existing approaches have not exploited the full potential of non-parallel bilingual data in either training or decoding. In this paper, we propose mirror-generative NMT (MGNMT), a single unified architecture that simultaneously integrates the source-to-target translation model, the target-to-source translation model, and two language models. The translation models and language models share the same latent semantic space, so both translation directions can learn from non-parallel data more effectively. Moreover, the translation models and language models can collaborate during decoding. Our experiments show that the proposed MGNMT consistently outperforms existing approaches in a variety of scenarios and language pairs, including both resource-rich and low-resource languages.
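The abstract leaves the shared-latent-space factorization implicit. Since both translation models and both language models condition on the same latent variable z, one consistent reading is a symmetric ("mirror") decomposition of the joint conditional likelihood that averages the two factorization orders, p(x|z)p(y|x,z) and p(y|z)p(x|y,z). Below is a minimal PyTorch sketch of that score and of how it could let the target language model and the backward (reconstruction) translation model vote alongside the forward translation model when reranking decoding candidates. All function and tensor names are illustrative assumptions, not identifiers from the paper's code.

```python
import torch

def mirror_joint_log_prob(log_p_y_given_xz: torch.Tensor,
                          log_p_x_given_z: torch.Tensor,
                          log_p_x_given_yz: torch.Tensor,
                          log_p_y_given_z: torch.Tensor) -> torch.Tensor:
    """Symmetric "mirror" decomposition of the joint conditional likelihood:

        log p(x, y | z) = 1/2 * [ log p(y | x, z) + log p(x | z)
                                + log p(x | y, z) + log p(y | z) ]

    Each argument is a tensor of per-sentence log-probabilities, assumed to
    have been produced by the forward translation model, the source language
    model, the backward translation model, and the target language model.
    """
    return 0.5 * (log_p_y_given_xz + log_p_x_given_z
                  + log_p_x_given_yz + log_p_y_given_z)

if __name__ == "__main__":
    # Illustrative use: rerank beam candidates y for one fixed source x.
    n_candidates = 5
    # Dummy per-candidate scores standing in for real model outputs.
    fwd_tm = torch.randn(n_candidates)  # log p(y | x, z)
    src_lm = torch.randn(n_candidates)  # log p(x | z); constant in y in practice
    bwd_tm = torch.randn(n_candidates)  # log p(x | y, z)
    tgt_lm = torch.randn(n_candidates)  # log p(y | z)

    scores = mirror_joint_log_prob(fwd_tm, src_lm, bwd_tm, tgt_lm)
    best = int(torch.argmax(scores))
    print(f"best candidate index: {best}")
```

Under this reading, a candidate translation is preferred when it is likely under the forward translation model, fluent under the target language model, and able to reconstruct the source through the backward translation model, which matches the abstract's claim that the translation models and language models collaborate during decoding.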
