
VGG 4x without degradation

2019-09-20

An algorithm that effectively prunes channels in each layer of a network, accelerating VGG-16 by 4x without accuracy degradation. Accepted as a poster at _ICCV 2017_.
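
The paper selects which channels to keep with LASSO regression on a layer's responses and then reconstructs the remaining weights by least squares. Below is a minimal NumPy/scikit-learn sketch of that two-step idea on synthetic data; the shapes, the `alpha` value, and all variable names are illustrative assumptions, not the repo's actual interface.

```python
# Hedged sketch of LASSO channel selection + least-squares reconstruction
# on synthetic data; not the channel-pruning repo's API.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, c_in, khw, n_out = 500, 16, 9, 32              # samples, input channels, kernel k*k, output channels

X = rng.standard_normal((n, c_in, khw))           # sampled input patches, split by channel
W = rng.standard_normal((n_out, c_in, khw))       # original convolution weights
W[:, c_in // 2:, :] *= 0.01                       # make half the channels nearly redundant (illustrative)
Y = np.einsum('nck,ock->no', X, W)                # responses the pruned layer must reproduce

# Step 1: LASSO over per-channel coefficients; each channel's contribution
# to Y becomes one column of the design matrix, and channels whose
# coefficient is driven to zero are pruned.
Z = np.einsum('nck,ock->noc', X, W)               # (n, n_out, c_in) per-channel contributions
A = Z.reshape(n * n_out, c_in)
lasso = Lasso(alpha=0.1, fit_intercept=False, max_iter=10000)
lasso.fit(A, Y.reshape(-1))
keep = np.flatnonzero(lasso.coef_)
print(f"keeping {keep.size}/{c_in} channels")

# Step 2: least-squares reconstruction of the layer's weights using only
# the channels that survived selection.
X_kept = X[:, keep, :].reshape(n, -1)             # (n, kept * khw)
W_flat, *_ = np.linalg.lstsq(X_kept, Y, rcond=None)
W_pruned = W_flat.T.reshape(n_out, keep.size, khw)

rel_err = np.linalg.norm(X_kept @ W_flat - Y) / np.linalg.norm(Y)
print(f"relative reconstruction error: {rel_err:.3f}")
```

In the paper this procedure is applied layer by layer, with the sparsity penalty raised until the target channel count is reached.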

```latex
@article{he2017channel,
  title={Channel Pruning for Accelerating Very Deep Neural Networks},
  author={He, Yihui and Zhang, Xiangyu and Sun, Jian},
  journal={arXiv preprint arXiv:1707.06168},
  year={2017}
}
```
Models:
VGG-16 accelerated 4x: 10.1% top-5 error and 29.4% top-1 error on ILSVRC-2012.

[[PDF](https://arxiv.org/abs/1707.06168)] [[Github repo](https://github.com/yihui-he/channel-pruning)]
