
Tensorflow NALU

A TensorFlow implementation of the Neural Arithmetic Logic Unit (Trask et al.). Paper: https://arxiv.org/abs/1808.00508

Modifications to the original model

The NALU cell proposed in the paper has a major drawback: it cannot model multiplication of negative inputs, since multiplication is implemented as addition in log-space.

A solution is to avoid working in the log-space of the inputs and instead use the asinh function, as proposed here: https://www.reddit.com/r/MachineLearning/comments/94833t/neural_arithmetic_logic_units/e3u974x/
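
For intuition, here is a minimal sketch (not code from this repository; the values are illustrative) of why the paper's log-space path loses the sign of a product, while asinh, being odd and defined for all real inputs, preserves sign information:

import numpy as np

a, b = -2.0, 3.0

# The paper's multiplicative path sums the inputs in log-space. Since log is
# only defined for positive values, |x| + eps is used, so the sign is lost:
log_space = np.exp(np.log(np.abs(a) + 1e-7) + np.log(np.abs(b) + 1e-7))
print(log_space)                       # ~6.0, although a * b = -6.0

# asinh is defined on all real numbers and is odd (asinh(-x) = -asinh(x)),
# so an asinh-based transformation keeps the sign that log(|x|) discards.
print(np.arcsinh(a), np.arcsinh(b))    # -1.44..., 1.81...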

Getting Started

from nalu.layers.nalu_layer import NaluLayer

# Configure the layer with the input feature count, output dimension, hidden
# dimension, number of NALU cells, and the core cell type.
nalu_layer = NaluLayer(FEATURES_NUM, OUTPUT_DIM, HIDDEN_DIM, N_CELLS, CORE_CELL_TYPE)
outputs = nalu_layer(inputs)
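
As a concrete (hypothetical) usage sketch: the dimensions below and the "nalu" core cell type are placeholder assumptions, and the layer is assumed to accept a TensorFlow tensor directly as in the snippet above; check the repository for the exact constructor arguments and supported cell types.

import tensorflow as tf
from nalu.layers.nalu_layer import NaluLayer

# Hypothetical configuration: 10 input features, 1 output, hidden dim 32,
# 2 stacked cells; "nalu" is assumed to name the core cell type.
nalu_layer = NaluLayer(10, 1, 32, 2, "nalu")

inputs = tf.random.uniform((4, 10))    # a batch of 4 examples with 10 features
outputs = nalu_layer(inputs)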

