CRITICAL INITIALISATION IN CONTINUOUS APPROXIMATIONS OF BINARY NEURAL NETWORKS

2019-12-30

Abstract

The training of stochastic neural network models with binary (±1) weights and activations via continuous surrogate networks is investigated. We derive new surrogates by writing the stochastic neural network as a Markov chain; this derivation also encompasses existing variants of the surrogates presented in the literature. Following this, we theoretically study the surrogates at initialisation. Using mean field theory, we derive a set of scalar equations describing how input signals propagate through the randomly initialised networks. The equations reveal whether so-called critical initialisations exist for each surrogate network, at which the network can be trained to arbitrary depth. Moreover, we predict theoretically, and confirm numerically, that common weight initialisation schemes used in standard continuous networks, when applied to the mean values of the stochastic binary weights, yield poor training performance. This study shows that, contrary to common intuition, the means of the stochastic binary weights should be initialised close to ±1 for deeper networks to be trainable.
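The abstract's central claim can be illustrated with a minimal sketch. The following is *not* the paper's exact surrogate: it assumes a deterministic "mean propagation" network in which each layer applies the means of the stochastic binary weights with a tanh nonlinearity and 1/√N fan-in scaling, then compares the norm of a signal after 30 layers under a standard small-Gaussian initialisation of the means versus means at ±1.

```python
import numpy as np

# Hypothetical surrogate: deterministic forward pass through the
# *means* of the stochastic binary weights (an assumption for
# illustration, not the paper's exact construction).
rng = np.random.default_rng(0)
N, L = 512, 30  # width and depth

def propagate(weight_means, x):
    for W in weight_means:
        x = np.tanh(W @ x / np.sqrt(N))  # 1/sqrt(N) fan-in scaling
    return x

x0 = rng.standard_normal(N)

# Standard continuous-network scheme applied to the means:
# small Gaussian values well inside (-1, 1).
means_small = [0.1 * rng.standard_normal((N, N)) for _ in range(L)]
norm_small = np.linalg.norm(propagate(means_small, x0))

# Means initialised at +/-1, as the abstract advocates.
means_pm1 = [rng.choice([-1.0, 1.0], size=(N, N)) for _ in range(L)]
norm_pm1 = np.linalg.norm(propagate(means_pm1, x0))

print(norm_small)  # signal all but vanishes after 30 layers
print(norm_pm1)    # signal magnitude survives to depth 30
```

In mean-field terms, a weight-mean scale of 0.1 gives an effective per-layer variance multiplier of 0.01, so the signal collapses geometrically with depth, whereas means at ±1 put the tanh surrogate near its critical point and the signal decays only slowly.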

