Abstract
Neural generative models for open-domain chit-chat conversation have become an active research area in recent years. A critical issue with most existing generative models is that their responses lack informativeness and diversity. Some researchers have attempted to leverage the results of retrieval models to strengthen generative models, but such approaches are limited by the quality of the retrieval results. In this work, we propose a memory-augmented generative model that learns to abstract from the training corpus and stores the distilled information in a memory to assist response generation. Our model clusters query-response pairs, extracts the characteristics of each cluster, and learns to exploit these characteristics during response generation. Experimental results show that our model outperforms competitive baselines.