
Exact Hard Monotonic Attention for Character-Level Transduction

Abstract: Many common character-level, string-to-string transduction tasks, e.g. grapheme-to-phoneme conversion and morphological inflection, consist almost exclusively of monotonic transduction. Neural sequence-to-sequence models with soft attention, which are non-monotonic, often outperform popular monotonic models. In this work, we ask the following question: Is monotonicity really a helpful inductive bias in these tasks? We develop a hard attention sequence-to-sequence model that enforces strict monotonicity and learns a latent alignment jointly while learning to transduce. With the help of dynamic programming, we are able to compute the exact marginalization over all monotonic alignments. Our models achieve state-of-the-art performance on morphological inflection. Furthermore, we find strong performance on two other character-level transduction tasks. Code is available at https://github.com/shijie-wu/neural-transducer.
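The exact marginalization the abstract refers to can be computed with a forward-algorithm-style dynamic program over alignment positions. Below is a minimal NumPy sketch under assumed inputs: emission log-scores emit[i, j] = log p(y_i | x, a_i = j), transition log-probabilities trans[k, j] = log p(a_i = j | a_{i-1} = k) with leftward transitions set to -inf to enforce strict monotonicity, and an initial alignment distribution init. These names and the first-order factorization are illustrative assumptions, not the paper's actual interface.

```python
import numpy as np
from scipy.special import logsumexp

def monotonic_log_marginal(init, trans, emit):
    """Exact log p(y | x), summed over all monotonic hard alignments.

    init[j]     -- log p(a_1 = j), shape (S,)
    trans[k, j] -- log p(a_i = j | a_{i-1} = k), shape (S, S);
                   entries with j < k are -inf, enforcing monotonicity
    emit[i, j]  -- log p(y_i | x, a_i = j), shape (T, S)
    """
    T, S = emit.shape
    # alpha[j] = log p(y_1..i, a_i = j); initialize with the first target symbol.
    alpha = init + emit[0]
    for i in range(1, T):
        # Forward recursion: marginalize (in log space) over the previous position k.
        alpha = logsumexp(alpha[:, None] + trans, axis=0) + emit[i]
    # Sum out the final alignment position.
    return logsumexp(alpha)

# Tiny usage example with random scores (hypothetical, for illustration only).
rng = np.random.default_rng(0)
T, S = 4, 6  # target length, source length
# In a real model these emission scores come from a decoder over the output
# vocabulary; arbitrary finite values suffice to exercise the recursion.
emit = rng.normal(size=(T, S))
trans = rng.normal(size=(S, S))
trans[np.tril_indices(S, k=-1)] = -np.inf   # forbid moving left: only j >= k
trans -= logsumexp(trans, axis=1, keepdims=True)  # normalize each row
init = np.full(S, -np.inf)
init[0] = 0.0                               # alignments start at source position 0
print(monotonic_log_marginal(init, trans, emit))
```

Each of the T steps is one (S x S) matrix-vector product in the log semiring, so the whole marginalization costs O(T S^2), which is what makes training with an exactly marginalized latent alignment tractable.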

