Abstract
Wasserstein Generative Adversarial Nets (GANs) are a recently proposed family of GAN algorithms and are widely used in computer vision, web mining, information retrieval, etc. However, existing algorithms based on an approximated Wasserstein loss converge slowly due to their heavy computation cost, and they often produce unstable results as well. In this paper, we address the computation cost problem by speeding up Wasserstein GANs with a well-designed, communication-efficient parallel architecture. Specifically, we develop a new problem formulation targeting the accurate evaluation of the Wasserstein distance and propose an easily parallelizable optimization algorithm to train Wasserstein GANs. In contrast to traditional parallel architectures, our proposed framework is designed explicitly for the skewed parameter updates between the generator network and the discriminator network. Rigorous experiments reveal that our proposed framework achieves a significant improvement in convergence speed, with comparable stability in image generation, over state-of-the-art Wasserstein GAN algorithms.