Maximum Expected Likelihood Estimation for
Zero-Resource Neural Machine Translation
Abstract
While neural machine translation (NMT) has recently made remarkable progress on a handful of resource-rich language pairs, parallel corpora are not readily available for most language pairs. To address this problem, we propose an approach to zero-resource NMT via maximum expected likelihood estimation. The basic idea is to maximize the expectation, taken with respect to a pivot-to-source translation model, of the likelihood of the intended source-to-target model on a pivot-target parallel corpus. To approximate this expectation, we propose two methods for connecting the pivot-to-source and source-to-target models. Experiments on two zero-resource language pairs show that the proposed approach yields substantial gains over baseline methods. We also observe that when trained jointly with the source-to-target model, the pivot-to-source translation model obtains improvements over independent training.
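The objective described above can be sketched in symbols. The following formalization is a plausible reading of the abstract, not notation taken from the paper itself: here $z$, $x$, and $y$ denote pivot, source, and target sentences, $D_{z,y}$ the pivot-target parallel corpus, and $\theta_{z \rightarrow x}$, $\theta_{x \rightarrow y}$ the parameters of the pivot-to-source and source-to-target models, all of which are assumed names.

```latex
% Hypothetical sketch of the maximum expected likelihood objective:
% maximize, over source-to-target parameters, the expected log-likelihood
% of target sentences, where source sentences are drawn from the
% pivot-to-source model conditioned on the pivot side of the corpus.
\[
  J(\theta_{x \rightarrow y})
  = \sum_{(z, y) \in D_{z,y}}
    \mathbb{E}_{x \sim P(x \mid z;\, \theta_{z \rightarrow x})}
    \bigl[ \log P(y \mid x;\, \theta_{x \rightarrow y}) \bigr]
\]
```

Under this reading, the expectation over source sentences $x$ is intractable to compute exactly, since it sums over all possible translations of $z$; this is presumably where the two proposed approximation methods connecting the pivot-to-source and source-to-target models enter.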