
Automatic Grammatical Error Correction for Sequence-to-sequence Text Generation: An Empirical Study

2019-09-19

Abstract: Sequence-to-sequence (seq2seq) models have achieved tremendous success in text generation tasks. However, there is no guarantee that they can always generate sentences without grammatical errors. In this paper, we present a preliminary empirical study on whether and how much automatic grammatical error correction can help improve seq2seq text generation. We conduct experiments across various seq2seq text generation tasks including machine translation, formality style transfer, sentence compression and simplification. Experiments show the state-of-the-art grammatical error correction system can improve the grammaticality of generated text and can bring task-oriented improvements in the tasks where target sentences are in a formal style.
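The setup studied in the abstract is a post-editing pipeline: a seq2seq model produces a draft, and a grammatical error correction (GEC) system then rewrites it. A minimal sketch of that idea, where `translate` and `correct_grammar` are hypothetical stand-ins for a real seq2seq model and a real GEC system (not APIs from the paper):

```python
# Sketch of GEC as a post-editing step for seq2seq output.
# Both functions below are toy stand-ins, not real models.

def translate(source: str) -> str:
    # Hypothetical seq2seq model: suppose its draft contains a
    # grammatical error ("a informal" instead of "an informal").
    return "this is a informal sentence"

def correct_grammar(draft: str) -> str:
    # Hypothetical GEC system: a toy rule that fixes the article
    # error produced above.
    return draft.replace(" a informal", " an informal")

def generate_with_gec(source: str) -> str:
    # The pipeline the paper evaluates: generate first, then pass
    # the generated text through a GEC system.
    return correct_grammar(translate(source))

print(generate_with_gec("toy source sentence"))
# -> this is an informal sentence
```

In the paper's experiments the GEC system is applied only at inference time, so the same post-editing wrapper can be placed around any of the studied tasks (translation, style transfer, compression, simplification) without retraining the underlying model.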
