CIFAR-10 on Pytorch with VGG, ResNet and DenseNet

Train CIFAR10 with PyTorch

I'm playing with PyTorch on the CIFAR10 dataset.

Pros & cons

Pros:

- Built-in data loading and augmentation, very nice! (see the data-loading sketch below)
- Training is fast, maybe even a little bit faster.
- Very memory efficient!

Cons:

- No progress bar, sad :(
- No built-in log.
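
Since the built-in data loading and augmentation comes from torchvision, here is a minimal sketch of a typical CIFAR-10 input pipeline. The normalization statistics and batch sizes are common defaults, not necessarily the exact settings used in this repo.

```python
import torch
import torchvision
import torchvision.transforms as transforms

# Standard CIFAR-10 augmentation: padded random crop + horizontal flip,
# then normalization with commonly quoted CIFAR-10 channel statistics.
transform_train = transforms.Compose([
    transforms.RandomCrop(32, padding=4),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
    transforms.Normalize((0.4914, 0.4822, 0.4465), (0.2023, 0.1994, 0.2010)),
])

transform_test = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize((0.4914, 0.4822, 0.4465), (0.2023, 0.1994, 0.2010)),
])

trainset = torchvision.datasets.CIFAR10(root='./data', train=True,
                                        download=True, transform=transform_train)
trainloader = torch.utils.data.DataLoader(trainset, batch_size=128,
                                          shuffle=True, num_workers=2)

testset = torchvision.datasets.CIFAR10(root='./data', train=False,
                                       download=True, transform=transform_test)
testloader = torch.utils.data.DataLoader(testset, batch_size=100,
                                         shuffle=False, num_workers=2)
```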

Accuracy

| Model             | Acc.   |
| ----------------- | ------ |
| VGG16             | 92.64% |
| ResNet18          | 93.02% |
| ResNet50          | 93.62% |
| ResNet101         | 93.75% |
| MobileNetV2       | 94.43% |
| ResNeXt29(32x4d)  | 94.73% |
| ResNeXt29(2x64d)  | 94.82% |
| DenseNet121       | 95.04% |
| PreActResNet18    | 95.11% |
| DPN92             | 95.16% |

Learning rate adjustment

I manually change the lr during training:

- 0.1 for epoch [0, 150)
- 0.01 for epoch [150, 250)
- 0.001 for epoch [250, 350)
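
The same step schedule can also be expressed with PyTorch's built-in scheduler instead of editing the value by hand. A minimal sketch, assuming SGD with momentum 0.9 and weight decay 5e-4 (common CIFAR-10 defaults, not confirmed settings of this repo):

```python
import torch.nn as nn
import torch.optim as optim
from torch.optim.lr_scheduler import MultiStepLR

# Placeholder model; in practice this would be VGG16, ResNet18, etc.
net = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))

optimizer = optim.SGD(net.parameters(), lr=0.1,
                      momentum=0.9, weight_decay=5e-4)  # assumed defaults

# Multiply the lr by 0.1 at epochs 150 and 250: 0.1 -> 0.01 -> 0.001,
# matching the manual schedule above.
scheduler = MultiStepLR(optimizer, milestones=[150, 250], gamma=0.1)

for epoch in range(350):
    # ... run one training epoch and one test pass here ...
    scheduler.step()
```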

Resume the training with `python main.py --resume --lr=0.01`.
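
Resuming implies the training loop saves a checkpoint somewhere that `--resume` can reload. A minimal sketch of that save/restore pattern, assuming a checkpoint file at `./checkpoint/ckpt.pth` and the field names `net`, `acc`, and `epoch` (illustrative choices, not taken from the repo):

```python
import os
import torch

def save_checkpoint(net, acc, epoch, path='./checkpoint/ckpt.pth'):
    # Persist the weights plus enough metadata to pick up training later.
    os.makedirs(os.path.dirname(path), exist_ok=True)
    torch.save({'net': net.state_dict(), 'acc': acc, 'epoch': epoch}, path)

def load_checkpoint(net, path='./checkpoint/ckpt.pth'):
    # Restore the weights; return the accuracy and epoch recorded at save time.
    state = torch.load(path)
    net.load_state_dict(state['net'])
    return state['acc'], state['epoch']
```

With something like this in place, `--resume` reloads the saved weights and `--lr=0.01` restarts the optimizer at the second stage of the schedule above.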
