Abstract
In this paper we propose a novel method that generates
the parameters of an image transformation network through
one feed-forward pass of a meta network for neural style transfer.
Recent works on style transfer typically need to train an image
transformation network for every new style, encoding the style
into the network parameters through enormous iterations
of stochastic gradient descent; such models cannot generalize to new styles at inference time. To tackle
these issues, we build a meta network which takes in the
style image and generates a corresponding image transformation network directly. Compared with optimization-based
methods that train a separate network for every style, our meta network can handle an
arbitrary new style within 19 milliseconds on one modern
GPU card. The fast image transformation network generated
by our meta network is only 449 KB and is capable of
running in real time on a mobile device. We also investigate
the manifold of style transfer networks by manipulating the
hidden features of the meta network. Experiments
validate the effectiveness of our method. Code and trained
models will be released.
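To make the core idea concrete, the following is a minimal NumPy sketch of a meta network (hypernetwork) that maps a style encoding to the parameters of a tiny transformation network in a single forward pass; all sizes, names, and the toy per-pixel architecture are our own illustrative assumptions, not the paper's actual design:

```python
import numpy as np

rng = np.random.default_rng(0)

STYLE_DIM = 16   # assumed size of the style feature vector
HIDDEN_DIM = 32  # assumed hidden width of the meta network
PIX_DIM = 3      # toy transformation net: per-pixel RGB linear transform

# Parameters of the meta network itself (in the paper these are trained
# once over many styles; here they are random for illustration).
W1 = rng.standard_normal((HIDDEN_DIM, STYLE_DIM)) * 0.1
W2 = rng.standard_normal((PIX_DIM * PIX_DIM + PIX_DIM, HIDDEN_DIM)) * 0.1

def meta_network(style_feat):
    """One feed-forward pass: style features -> transformation-net params."""
    h = np.maximum(W1 @ style_feat, 0.0)   # ReLU hidden layer
    theta = W2 @ h                          # flattened generated parameters
    W = theta[:PIX_DIM * PIX_DIM].reshape(PIX_DIM, PIX_DIM)
    b = theta[PIX_DIM * PIX_DIM:]
    return W, b

def transform_image(img, W, b):
    """Toy transformation network: apply the generated linear color map."""
    return img @ W.T + b

style_feat = rng.standard_normal(STYLE_DIM)  # stand-in for a style encoding
W, b = meta_network(style_feat)              # generated without any SGD
img = rng.random((8, 8, PIX_DIM))            # dummy 8x8 RGB image
out = transform_image(img, W, b)
print(out.shape)
```

The key property mirrored here is that a new style requires only one forward pass of the meta network to obtain a full set of transformation-network parameters, rather than a per-style optimization loop.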