like_i_revnet_pytorch

2020-02-17

This is a model similar to i-RevNet, and it follows the same principle.

In addition, I wrote the backward function myself, so it genuinely reduces VRAM usage and can be composed with other, non-invertible modules.
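The memory saving comes from a custom backward pass. Below is a minimal sketch of the general pattern (an assumption about the design, not this repo's actual code; the name `RevBackward` and the `block.inverse()` method are hypothetical):

```python
import torch

class RevBackward(torch.autograd.Function):
    """Sketch of a memory-saving backward pass (hypothetical, not this repo's code).

    forward() runs the block without building an autograd graph and stores
    only the output; backward() reconstructs the input from the output via
    the block's inverse, rebuilds the local graph, and backpropagates.
    """

    @staticmethod
    def forward(ctx, x, block):
        ctx.block = block
        with torch.no_grad():
            y = block(x)              # no intermediate activations are kept
        ctx.save_for_backward(y)      # only the output is saved
        return y

    @staticmethod
    def backward(ctx, grad_y):
        (y,) = ctx.saved_tensors
        block = ctx.block
        with torch.no_grad():
            x = block.inverse(y)      # recompute the input instead of storing it
        x = x.detach().requires_grad_(True)
        with torch.enable_grad():
            y_again = block(x)        # rebuild the graph for this block only
        torch.autograd.backward(y_again, grad_y)
        return x.grad, None           # no gradient for the `block` argument
```

Chaining such blocks keeps peak activation memory roughly constant in depth, because each block recomputes its own input during the backward pass instead of storing it.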

i-RevNet is a surprising method: it saves a large amount of GPU memory, which lets me train larger models.

Dependencies

Currently I have only tested it with PyTorch 1.3.1.

How it works

TODO: Write when I have time...
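Until then, here is a minimal sketch of the additive-coupling idea that i-RevNet builds on (illustrative only; this block and the sub-network `f` are simplified assumptions, not this repo's exact layers):

```python
import torch
import torch.nn as nn

class ReversibleBlock(nn.Module):
    """Additive coupling block in the spirit of i-RevNet (illustrative sketch).

    The input channels are split in half; `f` is an arbitrary, non-invertible
    sub-network, yet the block as a whole is exactly invertible.
    """
    def __init__(self, channels):
        super().__init__()
        half = channels // 2
        # `f` is a hypothetical residual branch; the real project may differ.
        self.f = nn.Sequential(
            nn.Conv2d(half, half, 3, padding=1),
            nn.ReLU(),
            nn.Conv2d(half, half, 3, padding=1),
        )

    def forward(self, x):
        x1, x2 = torch.chunk(x, 2, dim=1)
        y1 = x2
        y2 = x1 + self.f(x2)
        return torch.cat([y1, y2], dim=1)

    def inverse(self, y):
        # Recover the inputs exactly from the outputs -- nothing was stored.
        y1, y2 = torch.chunk(y, 2, dim=1)
        x2 = y1
        x1 = y2 - self.f(y1)
        return torch.cat([x1, x2], dim=1)
```

Because `f` only ever runs forward, it needs no inverse itself; invertibility comes from the coupling structure around it.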

How to test

I used the CIFAR-10 dataset for testing.

  1. Download this repository

  2. Run python3 train_on_cifar10_with_rev_backward.py

  3. Use nvidia-smi to observe how much video memory is used for training.

  4. Kill the program.

  5. Run python3 train_on_cifar10_without_rev_backward.py

  6. Check the VRAM usage again.

  7. Unsurprisingly, the second run uses about twice as much VRAM as the first.
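As an alternative to nvidia-smi, PyTorch's allocator statistics can be queried from inside the script (illustrative; the training scripts in this repo may not print this):

```python
import torch

def report_peak_vram():
    """Print PyTorch's peak allocated CUDA memory, if a GPU is present."""
    if not torch.cuda.is_available():
        print("no CUDA device; use nvidia-smi on a GPU machine instead")
        return
    peak_mb = torch.cuda.max_memory_allocated() / 1024 ** 2
    print(f"peak allocated: {peak_mb:.1f} MiB")
```

Calling this after each epoch gives a number comparable between the two runs; note it reports memory allocated by PyTorch, which is somewhat lower than the total nvidia-smi shows.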

How to apply it to your own projects

This is a bit difficult to explain, so I suggest reading the code directly.

References

https://openreview.net/forum?id=HJsjkMb0Z
https://github.com/jhjacobsen/pytorch-i-revnet

