Semi-flat minima and saddle points by embedding neural networks to overparameterization
2020-02-25

Abstract

We theoretically study the landscape of the training error for neural networks in overparameterized cases. We consider three basic methods for embedding a network into a wider one with more hidden units, and discuss whether a minimum point of the narrower network gives a minimum or a saddle point of the wider one. Our results show that networks with smooth and ReLU activations have different partially flat landscapes around the embedded point. We also relate these results to a difference in their generalization abilities under overparameterized realization.
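One of the simplest embeddings of the kind the abstract describes is unit replication: duplicate a hidden unit and split its output weight between the two copies, which leaves the network function unchanged and yields a one-parameter family of wider networks realizing the same minimum. The sketch below illustrates this for a one-hidden-layer ReLU network; the function names and the choice of splitting parameter are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def forward(W, v, X):
    # One-hidden-layer ReLU network: f(x) = v . relu(W x)
    return np.maximum(W @ X.T, 0.0).T @ v

rng = np.random.default_rng(0)
d, h, n = 3, 4, 5
W = rng.normal(size=(h, d))   # hidden-layer weights
v = rng.normal(size=h)        # output weights
X = rng.normal(size=(n, d))   # a batch of inputs

# Embed into h+1 hidden units by duplicating unit 0 and splitting its
# output weight v[0] into lam*v[0] and (1-lam)*v[0]. For any lam this
# reproduces the original function, so the embedded point sits on a
# flat line in the wider parameter space.
lam = 0.3
W_big = np.vstack([W, W[0:1]])                       # copy of unit 0's weights
v_big = np.concatenate([v, [(1.0 - lam) * v[0]]])    # second copy's share
v_big[0] = lam * v[0]                                # first copy's share

assert np.allclose(forward(W, v, X), forward(W_big, v_big, X))
```

Because the split parameter `lam` is free, every replicated minimum of the narrow network lifts to a (at least) one-dimensional flat set in the wider network, which is the kind of partially flat structure the abstract refers to.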

