
ShuffleNetV2_vs_MnasNet.PyTorch

Unofficial implementation of the paper: MnasNet: Platform-Aware Neural Architecture Search for Mobile

Unofficial implementation of the paper: ShuffleNet V2: Practical Guidelines for Efficient CNN Architecture Design

Requirements

  • Python3

  • PyTorch 0.4.1

  • tensorboardX (optional)

  • torchnet

  • pretrainedmodels (optional)

Results

We evaluate the models on ImageNet-1K. Both the training and validation images are scaled to 256 pixels on the shorter side. Only mirroring (horizontal flip) and RandomResizedCrop are used as training data augmentation; during validation, a 224x224 center crop is used.
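A minimal sketch of this preprocessing pipeline with standard torchvision transforms is shown below; the exact pipeline in this repository may differ, and the normalization statistics here are the usual ImageNet values, assumed rather than taken from the code.

```python
import torchvision.transforms as transforms

# Assumed ImageNet mean/std; the repository's actual values may differ.
normalize = transforms.Normalize(mean=[0.485, 0.456, 0.406],
                                 std=[0.229, 0.224, 0.225])

# Training: random resized crop to 224x224 plus horizontal flip (mirror).
train_transform = transforms.Compose([
    transforms.RandomResizedCrop(224),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
    normalize,
])

# Validation: resize the shorter side to 256, then take a 224x224 center crop.
val_transform = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    normalize,
])
```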

CPU Info: Intel(R) Xeon(R) CPU E5-2650 v3 @ 2.30GHz

ImageNet-1K

Model              Validation (Top-1)    Validation (Top-5)    CPU Cost (ms)
MnasNet            64.91                 86.28                 ~300
ShuffleNetV2 x1    61.83                 83.99                 ~100
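How the CPU cost is measured is not detailed here; a simple sketch of timing a single-image forward pass on CPU is given below. The `cpu_latency_ms` helper and its parameters are hypothetical, not part of this repository.

```python
import time
import torch

def cpu_latency_ms(model, runs=20):
    # Average forward-pass latency (ms) for a single 224x224 image on CPU.
    model.eval()
    x = torch.randn(1, 3, 224, 224)
    with torch.no_grad():
        for _ in range(3):      # warm-up iterations
            model(x)
        start = time.time()
        for _ in range(runs):
            model(x)
    return (time.time() - start) / runs * 1000.0
```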

Note

The implementations of these networks may differ somewhat from the original methods, so we cannot reach the best performance reported in the papers.

