PerforatedCNNs: Acceleration through Elimination of Redundant Convolutions

2020-02-05

Abstract 

We propose a novel approach to reducing the computational cost of evaluating convolutional neural networks, a factor that has hindered their deployment in low-power devices such as mobile phones. Inspired by the loop perforation technique from source code optimization, we speed up the bottleneck convolutional layers by skipping their evaluation in some of the spatial positions. We propose and analyze several strategies for choosing these positions. We demonstrate that perforation can accelerate modern convolutional networks such as AlexNet and VGG-16 by a factor of 2×–4×. Additionally, we show that perforation is complementary to the recently proposed acceleration method of Zhang et al. [28].
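The idea in the abstract can be sketched in a few lines: evaluate the convolution only at a subset of output positions (the non-perforated ones) and fill the skipped positions from their nearest evaluated neighbour. This is a minimal NumPy illustration of that idea, not the paper's implementation; the function name, the single-channel setting, and the nearest-neighbour fill rule are all simplifying assumptions.

```python
import numpy as np

def perforated_conv2d(x, kernel, mask):
    """Naive 2D valid correlation evaluated only where mask is True.

    Skipped (perforated) positions are filled from the nearest
    evaluated position, mimicking the interpolation step.
    x: (H, W) input; kernel: (kH, kW); mask: (H_out, W_out) bool.
    Hypothetical sketch, not the paper's implementation.
    """
    kH, kW = kernel.shape
    H_out, W_out = x.shape[0] - kH + 1, x.shape[1] - kW + 1
    out = np.zeros((H_out, W_out))
    # Evaluate the convolution only at the non-perforated positions.
    for i, j in zip(*np.nonzero(mask)):
        out[i, j] = np.sum(x[i:i + kH, j:j + kW] * kernel)
    # Fill perforated positions with the nearest evaluated value
    # (nearest in Manhattan distance, ties broken arbitrarily).
    eval_pts = np.argwhere(mask)
    for i in range(H_out):
        for j in range(W_out):
            if not mask[i, j]:
                d = np.abs(eval_pts - [i, j]).sum(axis=1)
                src = eval_pts[d.argmin()]
                out[i, j] = out[src[0], src[1]]
    return out

# Uniform perforation: evaluate every other position in each dimension,
# so only a quarter of the output positions incur a convolution.
x = np.arange(36, dtype=float).reshape(6, 6)
k = np.ones((3, 3)) / 9.0
mask = np.zeros((4, 4), dtype=bool)
mask[::2, ::2] = True
y = perforated_conv2d(x, k, mask)
```

With this uniform grid mask, the number of convolution evaluations drops by 4×, which is the source of the speedup; the paper's contribution is in choosing the mask (the perforation strategy) so that accuracy is preserved.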

