
nalu.pytorch


Summary

This is a PyTorch implementation of Neural Arithmetic Logic Units from DeepMind.

The goal of the paper is to let neural networks learn simple arithmetic and extrapolate to numbers outside the training range. Two modules are presented, the NAC and the NALU: the first handles addition and subtraction, while the second also covers multiplication, division, square, and square root.

The model from the paper is implemented in the Python module nalu.py.
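For orientation, here is a minimal sketch of the NAC/NALU formulation from the paper; the actual nalu.py in this repository may differ in details. The NAC constrains its effective weights to tanh(W_hat) * sigmoid(M_hat), and the NALU gates between that additive path and the same NAC applied in log space:

import torch
import torch.nn as nn
import torch.nn.functional as F

class NAC(nn.Module):
    """Neural Accumulator: a linear layer whose effective weights are pushed
    towards {-1, 0, +1}, which biases it towards addition and subtraction."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W_hat = nn.Parameter(torch.empty(out_dim, in_dim))
        self.M_hat = nn.Parameter(torch.empty(out_dim, in_dim))
        nn.init.xavier_uniform_(self.W_hat)
        nn.init.xavier_uniform_(self.M_hat)

    def forward(self, x):
        # W = tanh(W_hat) * sigmoid(M_hat): entries saturate near -1, 0, +1
        W = torch.tanh(self.W_hat) * torch.sigmoid(self.M_hat)
        return F.linear(x, W)

class NALU(nn.Module):
    """Neural Arithmetic Logic Unit: a learned gate mixes the additive NAC
    path with the same NAC applied in log space, where multiplication,
    division, powers and roots become additions."""
    def __init__(self, in_dim, out_dim, eps=1e-10):
        super().__init__()
        self.eps = eps
        self.nac = NAC(in_dim, out_dim)
        self.G = nn.Parameter(torch.empty(out_dim, in_dim))
        nn.init.xavier_uniform_(self.G)

    def forward(self, x):
        a = self.nac(x)                                              # add / subtract path
        m = torch.exp(self.nac(torch.log(torch.abs(x) + self.eps)))  # multiply / divide path
        g = torch.sigmoid(F.linear(x, self.G))                       # gate between the two
        return g * a + (1 - g) * m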



Tests

Several tests are described in the paper; I've only implemented what they call the static (non-recurrent) arithmetic tests (appendix B), where the network must learn several kinds of operations.

The network is first trained on one range of numbers, then tested both on that range and on a range it never saw: interpolation and extrapolation.

I've used the range [1, 100] for interpolation and [101, 200] for extrapolation.

See the Jupyter notebook train.ipynb for the results and the full training procedure.
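As a rough illustration of how such a static test can be run, here is a minimal sketch that reuses the NALU class from above; make_batch, the input dimension, and the addition target are illustrative assumptions rather than the notebook's exact setup:

import torch
import torch.nn.functional as F

def make_batch(low, high, batch_size=128, in_dim=100):
    # Hypothetical data generator: sample inputs uniformly from [low, high]
    # and build the target from two fixed slices of the input, here a + b.
    x = torch.empty(batch_size, in_dim).uniform_(low, high)
    a = x[:, :in_dim // 2].sum(dim=1, keepdim=True)
    b = x[:, in_dim // 2:].sum(dim=1, keepdim=True)
    return x, a + b

model = NALU(in_dim=100, out_dim=1)   # NALU as defined in the sketch above
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(10_000):
    x, y = make_batch(1, 100)         # training / interpolation range
    loss = F.mse_loss(model(x), y)
    opt.zero_grad()
    loss.backward()
    opt.step()

# Extrapolation: evaluate on numbers the network never saw during training.
with torch.no_grad():
    x_test, y_test = make_batch(101, 200)
    print(F.mse_loss(model(x_test), y_test).item())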


References

@misc{1808.00508,
  author = {Andrew Trask and Felix Hill and Scott Reed and Jack Rae and Chris Dyer and Phil Blunsom},
  title  = {Neural Arithmetic Logic Units},
  year   = {2018},
  eprint = {arXiv:1808.00508},
}

