Abstract
Sentence ordering is the task of restoring the original paragraph from an unordered set of sentences. It requires capturing global dependencies among sentences regardless of their input order. In this paper, we
propose a novel and flexible graph-based neural
sentence ordering model, which adopts a graph recurrent network [Zhang et al., 2018] to accurately learn semantic representations of the sentences. Instead of assuming connections between all pairs of input sentences, we use entities that are shared among multiple sentences to construct more expressive graph representations with
less noise. Experimental results show that our
proposed model outperforms the existing state-of-the-art systems on several benchmark datasets,
demonstrating the effectiveness of our model. We
also conduct a thorough analysis of how entities improve performance. Our code is available at
https://github.com/DeepLearnXMU/NSEG.git
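
To make the entity-based graph construction concrete, the following is a minimal, hypothetical sketch (not the released implementation; the function name build_sentence_graph and the input format are illustrative assumptions): two sentences are linked by an edge only if they mention at least one common entity, rather than connecting every pair of sentences.

    # Illustrative sketch: link sentences that share an entity mention,
    # instead of assuming a fully connected sentence graph.
    from itertools import combinations
    from typing import List, Set, Tuple

    def build_sentence_graph(sentence_entities: List[Set[str]]) -> List[Tuple[int, int]]:
        """sentence_entities[i] holds the entity mentions found in sentence i."""
        edges = []
        for i, j in combinations(range(len(sentence_entities)), 2):
            if sentence_entities[i] & sentence_entities[j]:  # at least one shared entity
                edges.append((i, j))
        return edges

    # Example: sentences 0 and 2 both mention "Obama", so they are connected;
    # sentence 1 shares no entity with the others and receives no edge.
    print(build_sentence_graph([{"Obama", "election"}, {"weather"}, {"Obama"}]))
    # -> [(0, 2)]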