Zero-Shot Cross-Lingual Abstractive Sentence Summarization through Teaching Generation and Attention

2019-09-19
Abstract: Abstractive Sentence Summarization (ASSUM) aims to grasp the core idea of a source sentence and present it as a summary. It has been extensively studied with statistical and neural models trained on large-scale monolingual source-summary parallel corpora. However, no cross-lingual parallel corpus, in which the language of the source sentence differs from that of the summary, is available to directly train a cross-lingual ASSUM system. We propose to solve this zero-shot problem by using a resource-rich monolingual ASSUM system to teach a zero-shot cross-lingual ASSUM system on both summary word generation and attention. This teaching process is combined with a back-translation process that simulates source-summary pairs. Experiments on the cross-lingual ASSUM task show that our proposed method significantly outperforms pipeline baselines and previous work, and brings cross-lingual performance substantially closer to monolingual performance. We release the code and data at https://github.com/KelleyYin/Cross-lingual-Summarization.
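The teaching idea described above can be sketched as a teacher-student objective: the student (cross-lingual) model is penalized for diverging from the teacher (monolingual) model both in its per-position word distributions (generation) and in its attention weights. The sketch below is a minimal illustration of that general scheme, not the paper's actual implementation; the function names, the KL-plus-MSE form, and the weighting parameter `lam` are assumptions for illustration.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two probability distributions over the summary
    vocabulary; eps guards against log(0)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(p * (np.log(p + eps) - np.log(q + eps))))

def teaching_loss(teacher_probs, student_probs,
                  teacher_attn, student_attn, lam=1.0):
    """Illustrative teaching loss (hypothetical form):
    - generation term: KL between teacher and student word distributions,
      summed over target positions;
    - attention term: mean squared error between the two attention matrices,
      weighted by lam."""
    gen_loss = sum(kl_divergence(t, s)
                   for t, s in zip(teacher_probs, student_probs))
    attn_loss = float(np.mean(
        (np.asarray(teacher_attn) - np.asarray(student_attn)) ** 2))
    return gen_loss + lam * attn_loss
```

For example, a student whose distributions and attention match the teacher's exactly incurs zero loss, while any divergence in either term increases it.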

