Understanding Weight Normalized Deep Neural Networks with Rectified Linear Units

Abstract 

This paper presents a general framework for norm-based capacity control for L_{p,q} weight normalized deep neural networks. We establish the upper bound on the Rademacher complexities of this family. With an L_{p,q} normalization where q ≤ p* and 1/p + 1/p* = 1, we discuss properties of a width-independent capacity control, which only depends on the depth by a square root term. We further analyze the approximation properties of L_{p,q} weight normalized deep neural networks. In particular, for an L_{1,∞} weight normalized network, the approximation error can be controlled by the L_1 norm of the output layer, and the corresponding generalization error only depends on the architecture by the square root of the depth.
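To make the normalization in the abstract concrete, below is a minimal NumPy sketch of an L_{p,q} weight norm, assuming the usual convention (the L_p norm of each hidden unit's incoming weights, followed by the L_q norm across units). The function names and the projection-style rescaling are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def lpq_norm(W: np.ndarray, p: float, q: float) -> float:
    """L_{p,q} norm of a layer's weight matrix W (one row per hidden unit):
    the L_p norm of each unit's incoming weights, then the L_q norm of the
    resulting vector. q = np.inf yields the maximum over units."""
    per_unit = np.linalg.norm(W, ord=p, axis=1)    # L_p norm of each row
    return float(np.linalg.norm(per_unit, ord=q))  # L_q norm across units

def lpq_project(W: np.ndarray, p: float, q: float, c: float = 1.0) -> np.ndarray:
    """Rescale W so that ||W||_{p,q} <= c (a simple projection-style
    normalization; the paper's exact constraint set may differ)."""
    n = lpq_norm(W, p, q)
    return W if n <= c else W * (c / n)

# Example: L_{1,inf} normalization, the case highlighted in the abstract.
rng = np.random.default_rng(0)
W = rng.normal(size=(64, 128))          # 64 hidden units, 128 inputs
W_hat = lpq_project(W, p=1, q=np.inf)   # every unit's L_1 norm is now <= 1
assert lpq_norm(W_hat, p=1, q=np.inf) <= 1.0 + 1e-12
```

With q = ∞ the constraint bounds each unit's L_p norm individually, which is why the L_{1,∞} case lets the output layer's L_1 norm control the approximation error.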
