Building a Regular Decision Boundary with Deep Networks

2019-11-28

Abstract: In this work, we build a generic Convolutional Neural Network architecture to discover empirical properties of neural networks. Our first contribution is to introduce a state-of-the-art framework that depends on few hyperparameters and to study the network as we vary them. It has no max pooling and no biases, uses only 13 layers, is purely convolutional, and reaches up to 95.4% and 79.6% accuracy on CIFAR10 and CIFAR100, respectively. We show that the nonlinearity of a deep network need not be continuous, non-expansive, or point-wise to achieve good performance. We also show that increasing the width of our network makes it competitive with very deep networks. Our second contribution is an analysis of the contraction and separation properties of this network. Indeed, a 1-nearest-neighbor classifier applied to deep features progressively improves with depth, which indicates that the representation becomes progressively more regular. Besides, we defined and analyzed local support vectors that separate classes locally. All our experiments are reproducible, and the code, based on TensorFlow, is available online.
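The depth-wise probe mentioned in the abstract, a 1-nearest-neighbor classifier run on features extracted at each layer, can be sketched as follows. This is a minimal illustration, not the paper's code: the feature arrays below are toy stand-ins for deep features, and `knn1_accuracy` is a hypothetical helper name.

```python
import numpy as np

def knn1_accuracy(train_feats, train_labels, test_feats, test_labels):
    """Classify each test point by its single nearest training neighbor
    (Euclidean distance) and return the resulting accuracy."""
    # Pairwise squared distances via ||a - b||^2 = ||a||^2 - 2 a.b + ||b||^2.
    d2 = (
        (test_feats ** 2).sum(axis=1, keepdims=True)
        - 2.0 * test_feats @ train_feats.T
        + (train_feats ** 2).sum(axis=1)
    )
    nearest = d2.argmin(axis=1)
    return float((train_labels[nearest] == test_labels).mean())

# Toy example: two well-separated Gaussian blobs standing in for the
# deep features of two classes at some layer.
rng = np.random.default_rng(0)
train = np.vstack([rng.normal(0, 1, (50, 8)), rng.normal(6, 1, (50, 8))])
train_y = np.array([0] * 50 + [1] * 50)
test = np.vstack([rng.normal(0, 1, (20, 8)), rng.normal(6, 1, (20, 8))])
test_y = np.array([0] * 20 + [1] * 20)
acc = knn1_accuracy(train, train_y, test, test_y)
```

Running this probe on the features of successive layers and watching the accuracy rise with depth is, in spirit, how the abstract's claim of a "progressively more regular" representation is quantified.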
