forward-thinking-pytorch

PyTorch implementation of the paper "Forward Thinking: Building and Training Neural Networks One Layer at a Time".
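The core idea is to grow the network greedily: a new layer is trained (with a temporary output head) while all previously trained layers stay frozen, then it too is frozen and the next layer is added. The sketch below is a minimal illustration of that procedure, not the code in this repository; the layer sizes, the temporary linear head, and the Adam optimizer are assumptions.

    # Minimal sketch of forward-thinking (greedy layer-wise) training.
    # Layer sizes, epochs, and the throwaway classification head are assumptions.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def train_layerwise(data_loader, in_dim=784, hidden=256, n_classes=10,
                        n_layers=5, epochs_per_layer=2, device="cpu"):
        frozen = []  # layers already trained; their weights are fixed
        for i in range(n_layers):
            new_layer = nn.Linear(in_dim if i == 0 else hidden, hidden).to(device)
            head = nn.Linear(hidden, n_classes).to(device)  # temporary classifier on top
            # Only the newest layer and the temporary head receive gradients.
            optimizer = torch.optim.Adam(list(new_layer.parameters()) + list(head.parameters()))

            for _ in range(epochs_per_layer):
                for x, y in data_loader:
                    x = x.view(x.size(0), -1).to(device)  # flatten image batches
                    y = y.to(device)
                    with torch.no_grad():                 # frozen layers: forward pass only
                        for layer in frozen:
                            x = F.relu(layer(x))
                    loss = F.cross_entropy(head(F.relu(new_layer(x))), y)
                    optimizer.zero_grad()
                    loss.backward()                       # gradients stop at the new layer
                    optimizer.step()

            # Freeze the newly trained layer and discard the temporary head.
            for p in new_layer.parameters():
                p.requires_grad_(False)
            frozen.append(new_layer)
        return frozen

Because each stage optimizes only one new layer against its own head, there is never a long gradient path through the whole network, which is why the procedure keeps working as the model gets deeper.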

Requirements

Usage

  $ ./run.sh

to run all four experiments: forward-thinking, forward-thinking (deep), backpropagation, and backpropagation (deep).

To run the forward-thinking experiment (5 layers):

$ python forward_thinking.py

To run the same experiment with a very deep model (20 layers):

$ python forward_thinking.py --epoch 200 --deep

For comparison, train the same model with standard backpropagation:

$ python backpropagate.py

To backpropagate with the very deep model (20 layers):

$ python backpropagate.py --epoch 200 --deep

Results

With 5 layers, forward-thinking learns slightly faster than backpropagation. Accuracy dips are observed each time a new layer is added.

With the very deep model (20 layers), backpropagation fails to train the network, whereas forward-thinking trains it as usual.


上一篇:Neural-Style-MMD

下一篇:interaction_network_pytorch

用户评价
全部评价

热门资源

  • Keras-ResNeXt

    Keras ResNeXt Implementation of ResNeXt models...

  • seetafaceJNI

    项目介绍 基于中科院seetaface2进行封装的JAVA...

  • spark-corenlp

    This package wraps Stanford CoreNLP annotators ...

  • capsnet-with-caps...

    CapsNet with capsule-wise convolution Project ...

  • inferno-boilerplate

    This is a very basic boilerplate example for pe...