
Look Harder: A Neural Machine Translation Model with Hard Attention

2019-09-19
Abstract: Soft-attention based Neural Machine Translation (NMT) models have achieved promising results on several translation tasks. These models attend to all the words in the source sequence for each target token, which makes them ineffective for long sequence translation. In this work, we propose a hard-attention based NMT model which selects a subset of source tokens for each target token to effectively handle long sequence translation. Due to the discrete nature of the hard-attention mechanism, we design a reinforcement learning algorithm coupled with a reward shaping strategy to efficiently train it. Experimental results show that the proposed model performs better on long sequences and thereby achieves significant BLEU score improvements on English-German (EN-DE) and English-French (EN-FR) translation tasks compared to soft-attention based NMT models.
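The core idea described in the abstract, attending to only a subset of source tokens per target token instead of soft-weighting all of them, can be illustrated with a minimal sketch. This is not the paper's implementation: the subset size k, the dot-product scoring, and the top-k selection rule are assumptions made here for illustration, whereas the paper learns the discrete selection with a reinforcement learning algorithm and reward shaping, which is omitted below.

```python
import torch
import torch.nn.functional as F

def soft_attention(query, keys, values):
    # Standard soft attention: every source token receives a weight.
    scores = query @ keys.transpose(-2, -1) / keys.size(-1) ** 0.5
    return F.softmax(scores, dim=-1) @ values

def hard_attention_topk(query, keys, values, k=4):
    # Hard-attention sketch: keep only the k highest-scoring source
    # tokens per target token and renormalize over that subset.
    # (Top-k is a stand-in for the paper's RL-trained selection.)
    scores = query @ keys.transpose(-2, -1) / keys.size(-1) ** 0.5
    topk_scores, topk_idx = scores.topk(k, dim=-1)
    weights = F.softmax(topk_scores, dim=-1)           # (T, k)
    selected = values[topk_idx]                        # (T, k, d)
    return (weights.unsqueeze(-1) * selected).sum(-2)  # (T, d)

# Toy shapes: 3 target tokens attending over 10 source tokens of dim 8.
q = torch.randn(3, 8)
K = torch.randn(10, 8)
V = torch.randn(10, 8)
print(hard_attention_topk(q, K, V).shape)  # torch.Size([3, 8])
```

Because only k source positions contribute per target token, the cost of the attention read-out no longer grows with the full source length, which is the motivation the abstract gives for long-sequence translation.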

