
NALU-pytorch

2020-02-10

Neural Arithmetic Logic Units

[WIP]

This is a PyTorch implementation of Neural Arithmetic Logic Units by Andrew Trask, Felix Hill, Scott Reed, Jack Rae, Chris Dyer and Phil Blunsom.


API

from models import *

# single layer modules
NeuralAccumulatorCell(in_dim, out_dim)
NeuralArithmeticLogicUnitCell(in_dim, out_dim)

# stacked layers
NAC(num_layers, in_dim, hidden_dim, out_dim)
NALU(num_layers, in_dim, hidden_dim, out_dim)
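The cells above implement the formulation from the paper: a NAC constrains its effective weights toward {-1, 0, 1}, and a NALU gates between an additive NAC path and a multiplicative NAC path in log-space. A rough self-contained sketch of that math in NumPy (illustrative only, not the repo's actual PyTorch code; all function and parameter names here are made up):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def nac_forward(x, W_hat, M_hat):
    # NAC: effective weights W = tanh(W_hat) * sigmoid(M_hat) are
    # softly constrained to [-1, 1] and biased toward {-1, 0, 1},
    # so the cell can only add/subtract its inputs.
    W = np.tanh(W_hat) * sigmoid(M_hat)
    return x @ W.T

def nalu_forward(x, W_hat, M_hat, G, eps=1e-10):
    # Additive path: a plain NAC over x.
    a = nac_forward(x, W_hat, M_hat)
    # Multiplicative path: a NAC in log-space, so products and
    # quotients of inputs become sums and differences of logs.
    m = np.exp(nac_forward(np.log(np.abs(x) + eps), W_hat, M_hat))
    # A learned sigmoid gate interpolates between the two paths.
    g = sigmoid(x @ G.T)
    return g * a + (1.0 - g) * m
```

For example, driving W_hat and M_hat strongly positive makes the effective weights approach 1, so `nac_forward([[2, 3]], ...)` approaches 2 + 3 = 5, while pushing the gate toward 0 makes `nalu_forward` approach the product 2 * 3 = 6.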

Experiments

To reproduce "Numerical Extrapolation Failures in Neural Networks" (Section 1.1), run:

python failures.py

This should generate the following plot:

[Figure: extrapolation error of MLPs with different activation functions]
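The failure mode behind this experiment is that saturating nonlinearities cap what the network can represent outside its training range. A toy NumPy illustration of the idea (a stand-in for intuition, not what failures.py actually runs): relu6 clamps at 6, so even a model that represents the identity exactly on [0, 6] must fail beyond it.

```python
import numpy as np

def relu6(x):
    # relu6 saturates at 6, so it cannot pass values above 6 through
    return np.minimum(np.maximum(x, 0.0), 6.0)

# A degenerate "network" that is exactly the identity on [0, 6]
def model(x):
    return relu6(x)

train_x = np.linspace(0.0, 6.0, 50)    # training (interpolation) range
test_x = np.linspace(6.0, 20.0, 50)    # extrapolation range

train_err = np.abs(model(train_x) - train_x).mean()  # zero on the training range
test_err = np.abs(model(test_x) - test_x).mean()     # grows with distance from 6
```

The same effect appears, in softer form, with tanh and sigmoid hidden layers: once activations saturate, the output can no longer track inputs outside the range seen during training.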

To reproduce "Simple Function Learning Tasks" (Section 4.1), run:

python function_learning.py

This should generate a text file called interpolation.txt with the following results. (Currently only interpolation is supported; I'm working on the rest.)


Operation    Relu6     None      NAC      NALU
a + b        4.472     0.132     0.154    0.157
a - b        85.727    2.224     2.403    34.610
a * b        89.257    4.573     5.382    1.236
a / b        97.070    60.594    5.730    3.042
a ^ 2        89.987    2.977     4.718    1.117
sqrt(a)      5.939     40.243    7.263    1.119

Notes

  • RMSprop works really well with NAC and NALU

  • A high learning rate (0.01) also works well

