Abstract. We present Swapnet, a framework to transfer garments across
images of people with arbitrary body pose, shape, and clothing. Garment transfer is a challenging task that requires (i) disentangling the
features of the clothing from the body pose and shape and (ii) realistic synthesis of the garment texture on the new body. We present a
neural network architecture that tackles these sub-problems with two
task-specific sub-networks. Since acquiring pairs of images showing the
same clothing on different bodies is difficult, we propose a novel weakly-supervised approach that generates training pairs from a single image
via data augmentation. We present the first fully automatic method for
garment transfer in unconstrained images without solving the difficult
3D reconstruction problem. We demonstrate a variety of transfer results
and highlight our advantages over traditional image-to-image and analogy pipelines.