Wide Compression: Tensor Ring Nets

Abstract: Deep neural networks have demonstrated state-of-the-art performance in a variety of real-world applications. In order to obtain performance gains, these networks have grown larger and deeper, containing millions or even billions of parameters and over a thousand layers. The trade-off is that these large architectures require an enormous amount of memory, storage, and computation, thus limiting their usability. Inspired by the recent tensor ring factorization, we introduce Tensor Ring Networks (TR-Nets), which significantly compress both the fully connected layers and the convolutional layers of deep neural networks. Our results show that our TR-Nets approach is able to compress LeNet-5 by 11× without losing accuracy, and can compress the state-of-the-art Wide ResNet by 243× with only 2.3% degradation in CIFAR-10 image classification. Overall, this compression scheme shows promise in scientific computing and deep learning, especially for emerging resource-constrained devices such as smartphones, wearables, and IoT devices.
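The abstract does not define tensor ring factorization itself. As a rough illustration only, the NumPy sketch below shows how a tensor ring stores a reshaped weight tensor as a cyclic chain of 3-way cores and what parameter savings that yields; the shapes, the rank, and the `tr_reconstruct` helper are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

# Tensor ring (TR) factorization, minimal sketch. A d-way tensor T of shape
# (n1, ..., nd) is stored as d cores G_k of shape (r_k, n_k, r_{k+1}),
# with r_{d+1} = r_1 closing the ring:
#   T[i1, ..., id] = trace(G_1[:, i1, :] @ G_2[:, i2, :] @ ... @ G_d[:, id, :])

def tr_reconstruct(cores):
    """Contract TR cores back into the full tensor (to check shapes/sizes)."""
    result = cores[0]                              # (r1, n1, r2)
    for core in cores[1:]:
        # (r1, n1..nk, r_k) x (r_k, n_{k+1}, r_{k+2}) -> (r1, n1..n_{k+1}, r_{k+2})
        result = np.tensordot(result, core, axes=([-1], [0]))
    # Close the ring: trace over the matched first/last rank dimensions.
    return np.trace(result, axis1=0, axis2=-1)

# Example (illustrative numbers): a 256 x 256 fully connected weight
# reshaped to (16, 16, 16, 16) and factorized with uniform ring rank r = 4.
shape, r = (16, 16, 16, 16), 4
ranks = [r] * (len(shape) + 1)                     # r_{d+1} = r_1
rng = np.random.default_rng(0)
cores = [rng.standard_normal((ranks[k], n, ranks[k + 1])) * 0.1
         for k, n in enumerate(shape)]

full = tr_reconstruct(cores)
dense_params = int(np.prod(shape))                 # 65,536 weights dense
tr_params = sum(c.size for c in cores)             # 1,024 weights in TR form
print(full.shape)                                  # (16, 16, 16, 16)
print(f"compression ratio: {dense_params / tr_params:.0f}x")  # 64x
```

In a deployed tensorized layer one would not materialize the full tensor as done here for checking; the layer input is contracted against the cores directly, which is where the memory and compute savings come from.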


Popular Resources

  • Stratified Strate...

    In this paper we introduce Stratified Strategy ...

  • The Variational S...

    Unlike traditional images which do not offer in...

  • Learning to learn...

    The move from hand-designed features to learned...

  • A Mathematical Mo...

    Direct democracy, where each voter casts one vo...

  • Learning to Predi...

    Much of model-based reinforcement learning invo...