
mixup_pytorch


Mixup: Beyond Empirical Risk Minimization in PyTorch

This is an unofficial PyTorch implementation of mixup: Beyond Empirical Risk Minimization. The code is adapted from PyTorch CIFAR.

Results

I tested only on CIFAR 10 and CIFAR 100. The network used is PreAct ResNet-18. For mixup, alpha is left at its default value of 1, which means the mixing weight is sampled uniformly between zero and one. Each setting was trained for 200 epochs with batch size 128. The learning rate is 0.1 (epochs 1-100), 0.01 (epochs 101-150), and 0.001 (epochs 151-200). A minimal sketch of the mixup step is given after the results table.

Dataset and model       Acc.
CIFAR 10, no mixup      94.97%
CIFAR 10, mixup         95.53%
CIFAR 100, no mixup     76.53%
CIFAR 100, mixup        77.83%
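
The sketch below illustrates the mixup step described above: with alpha = 1, the weight lam is drawn from Beta(1, 1), i.e. uniformly on [0, 1], and each batch is mixed with a shuffled copy of itself. The helper names (mixup_data, mixup_criterion) are illustrative, not necessarily those used in this repository, and the Linear model is a stand-in for the PreAct ResNet-18 actually trained.

# Illustrative mixup sketch; helper names are not from this repository.
import numpy as np
import torch
import torch.nn.functional as F

def mixup_data(x, y, alpha=1.0):
    # With alpha = 1, lam ~ Beta(1, 1), i.e. uniform on [0, 1].
    lam = np.random.beta(alpha, alpha) if alpha > 0 else 1.0
    index = torch.randperm(x.size(0), device=x.device)
    mixed_x = lam * x + (1.0 - lam) * x[index]
    return mixed_x, y, y[index], lam

def mixup_criterion(pred, y_a, y_b, lam):
    # Convex combination of the losses for the two label orderings.
    return lam * F.cross_entropy(pred, y_a) + (1.0 - lam) * F.cross_entropy(pred, y_b)

net = torch.nn.Linear(3 * 32 * 32, 10)  # stand-in model, not PreAct ResNet-18
x = torch.randn(128, 3, 32, 32)         # a CIFAR-10-sized batch
y = torch.randint(0, 10, (128,))

mixed_x, y_a, y_b, lam = mixup_data(x, y, alpha=1.0)
pred = net(mixed_x.flatten(1))
loss = mixup_criterion(pred, y_a, y_b, lam)

The step learning-rate schedule above (0.1, then 0.01 after epoch 100, then 0.001 after epoch 150) corresponds to torch.optim.lr_scheduler.MultiStepLR with milestones=[100, 150] and gamma=0.1; other optimizer settings are not stated in this README.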

[Figure: CIFAR 10 test accuracy evolution]

[Figure: CIFAR 100 test accuracy evolution]

Usage

# Train and test CIFAR 10 with mixup.
python main_cifar10.py --mixup --exp='cifar10_mixup'
# Train and test CIFAR 10 without mixup.
python main_cifar10.py --exp='cifar10_nomixup'
# Train and test CIFAR 100 with mixup.
python main_cifar100.py --mixup --exp='cifar100_mixup'
# Train and test CIFAR 100 without mixup.
python main_cifar100.py --exp='cifar100_nomixup'

