Maximum Expected Likelihood Estimation for Zero-Resource Neural Machine Translation

Abstract: While neural machine translation (NMT) has recently made remarkable progress in translating a handful of resource-rich language pairs, parallel corpora are not readily available for most language pairs. To deal with this problem, we propose an approach to zero-resource NMT via maximum expected likelihood estimation. The basic idea is to maximize, on a pivot-target parallel corpus, the expected likelihood of the intended source-to-target model with respect to a pivot-to-source translation model. To approximate this expectation, we propose two methods to connect the pivot-to-source and source-to-target models. Experiments on two zero-resource language pairs show that the proposed approach yields substantial gains over baseline methods. We also observe that when trained jointly with the source-to-target model, the pivot-to-source translation model obtains improvements over independent training.
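A sketch of the training objective implied by the abstract, with notation assumed here rather than taken from the paper: let z be a pivot sentence, y its target-side counterpart in the pivot-target corpus D, x a latent source sentence, θ_{z→x} the pivot-to-source model, and θ_{x→y} the intended source-to-target model. Maximum expected likelihood estimation would then train θ_{x→y} by maximizing the log-likelihood of y in expectation over source sentences drawn from the pivot-to-source model:

```latex
% Hedged sketch: the symbols z, x, y, \theta and the corpus D are assumed notation,
% reconstructed from the abstract's description, not copied from the paper.
J(\theta_{x \to y}) =
  \sum_{\langle z, y \rangle \in D}
  \mathbb{E}_{x \sim P(x \mid z;\, \theta_{z \to x})}
  \bigl[ \log P(y \mid x;\, \theta_{x \to y}) \bigr]
```

On this reading, the two proposed methods would correspond to different ways of approximating the expectation over x (e.g. by sampling source sentences from the pivot-to-source model); this interpretation is inferred from the abstract, not verified against the full paper.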
