Zero-Shot Cross-Lingual Abstractive Sentence Summarization through
Teaching Generation and Attention
Abstract
Abstractive Sentence Summarization (ASSUM) aims to grasp the core idea of the source sentence and present it as the summary. It has been extensively studied using statistical models or neural models trained on large-scale monolingual source-summary parallel corpora. However, there is no cross-lingual parallel corpus, in which the source sentence language differs from the summary language, with which to directly train a cross-lingual ASSUM system. We propose to solve this zero-shot problem by using a resource-rich monolingual ASSUM system to teach a zero-shot cross-lingual ASSUM system on both summary word generation and attention. This teaching process is accompanied by a back-translation process that simulates source-summary pairs.
Experiments on the cross-lingual ASSUM task show that our proposed method significantly outperforms pipeline baselines and previous work, and brings cross-lingual performance substantially closer to monolingual performance. We release the code and data
at https://github.com/KelleyYin/
Cross-lingual-Summarization.
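The teaching process described above pairs a monolingual teacher with a cross-lingual student on two signals: the summary word distribution at each decoding step, and the attention weights over source positions. A minimal sketch of such distillation losses follows; the use of KL divergence, the toy logits, and all names here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D logit vector.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def kl(p, q, eps=1e-12):
    # KL(p || q) between two discrete distributions.
    return float(np.sum(p * (np.log(p + eps) - np.log(q + eps))))

# Hypothetical teacher (monolingual) and student (cross-lingual) logits
# over a toy summary vocabulary at one decoding step.
teacher_logits = np.array([2.0, 0.5, -1.0, 0.1])
student_logits = np.array([1.5, 0.7, -0.5, 0.0])

# Generation teaching: pull the student's word distribution
# toward the teacher's distribution.
gen_loss = kl(softmax(teacher_logits), softmax(student_logits))

# Attention teaching: pull the student's attention weights over
# source positions toward the teacher's attention weights.
teacher_attn = softmax(np.array([1.0, 0.2, -0.3]))
student_attn = softmax(np.array([0.8, 0.4, -0.2]))
attn_loss = kl(teacher_attn, student_attn)

# The student would be trained to minimize the combined loss on
# back-translated (simulated) source-summary pairs.
total_loss = gen_loss + attn_loss
```

Both terms are non-negative and vanish only when the student exactly matches the teacher, which is what makes them usable as distillation objectives.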