Abstract
Many artificial intelligence tasks appear in dual forms, such as English↔French translation and speech↔text transformation. Existing dual learning schemes, which are proposed to solve a pair of such dual tasks, explore how to leverage such dualities at the data level. In this work, we propose a new learning framework, model-level dual learning, which takes the duality of tasks into consideration when designing the architectures for the primal/dual models, and ties the model parameters that play similar roles in the two tasks. We study both symmetric and asymmetric model-level dual learning. Our algorithms achieve significant improvements on neural machine translation and sentiment analysis.