MintNet: Building Invertible Neural Networks with Masked Convolutions

2020-02-25

Abstract

We propose a new way of constructing invertible neural networks by combining simple building blocks with a novel set of composition rules. This leads to a rich set of invertible architectures, including those similar to ResNets. Inversion is achieved with a locally convergent iterative procedure that is parallelizable and very fast in practice. Additionally, the determinant of the Jacobian can be computed analytically and efficiently, enabling their generative use as flow models. To demonstrate their flexibility, we show that our invertible neural networks are competitive with ResNets on MNIST and CIFAR-10 classification. When trained as generative models, our invertible networks achieve competitive likelihoods on MNIST, CIFAR-10 and ImageNet 32×32, with bits per dimension of 0.98, 3.32 and 4.06 respectively.
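The abstract's key computational claim is that inversion reduces to a parallelizable, locally convergent iterative procedure, which is possible because each layer's Jacobian is triangular with positive diagonal entries. Below is a minimal NumPy sketch of that idea on a toy triangular monotone map. The map `forward`, the step size `alpha`, and the iteration count are illustrative assumptions, not the paper's actual layers, and the paper's exact update rule may differ.

```python
# A minimal sketch (not the paper's exact method) of inverting a map whose
# Jacobian is triangular with positive diagonal, via damped fixed-point iteration.
import numpy as np

rng = np.random.default_rng(0)
d = 8
L = np.tril(rng.normal(size=(d, d)), k=-1)  # strictly lower-triangular weights

def forward(x):
    # Toy invertible map: the Jacobian is I + diag(1 - tanh^2(Lx)) @ L,
    # which is lower triangular with unit (hence positive) diagonal.
    return x + np.tanh(L @ x)

def invert(y, alpha=1.0, n_iters=100):
    # Damped fixed-point iteration: x <- x - alpha * (forward(x) - y).
    # All coordinates are updated at once (parallelizable), and the positive
    # triangular Jacobian makes the iteration locally convergent.
    x = y.copy()
    for _ in range(n_iters):
        x = x - alpha * (forward(x) - y)
    return x

x_true = rng.normal(size=d)
y = forward(x_true)
x_rec = invert(y)
print(np.max(np.abs(x_rec - x_true)))  # ~machine precision: x recovered
```

For this toy map the Jacobian is unit-triangular, so its log-determinant is trivially zero; the point of the paper's construction is that the diagonal, and hence the Jacobian determinant needed for flow-model likelihoods, remains analytically computable for richer layers as well.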
