
TCN-with-attention


TCN with attention

Temporal Convolutional Network with attention layer

The concept of the model is mostly like the Simple Neural Attentive Meta-Learner (SNAIL), but in this model an attention layer sits on top of every convolutional layer, and the attention size differs from SNAIL's.
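A minimal sketch of that idea in PyTorch is shown below: each causal temporal convolution block is followed by its own self-attention layer, and the blocks are stacked with increasing dilation. The framework choice, layer sizes, attention dimension, and classifier head are illustrative assumptions, not the repository's actual code or hyperparameters.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class CausalConvBlock(nn.Module):
    """1D causal convolution: left padding keeps output length equal to input length."""
    def __init__(self, in_ch, out_ch, kernel_size=3, dilation=1):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation
        self.conv = nn.Conv1d(in_ch, out_ch, kernel_size, dilation=dilation)

    def forward(self, x):                      # x: (batch, channels, time)
        x = F.pad(x, (self.pad, 0))            # pad only on the left (causal)
        return F.relu(self.conv(x))


class SelfAttention(nn.Module):
    """Scaled dot-product self-attention over the time axis, with a residual connection."""
    def __init__(self, channels, attn_dim=64):
        super().__init__()
        self.query = nn.Linear(channels, attn_dim)
        self.key = nn.Linear(channels, attn_dim)
        self.value = nn.Linear(channels, channels)

    def forward(self, x):                      # x: (batch, channels, time)
        h = x.transpose(1, 2)                  # (batch, time, channels)
        q, k, v = self.query(h), self.key(h), self.value(h)
        scores = q @ k.transpose(1, 2) / (q.size(-1) ** 0.5)
        out = torch.softmax(scores, dim=-1) @ v
        return (h + out).transpose(1, 2)       # back to (batch, channels, time)


class TCNWithAttention(nn.Module):
    """Stack of causal conv blocks, each followed by an attention layer."""
    def __init__(self, num_chars, embed_dim=64, channels=128,
                 num_layers=4, num_classes=4):
        super().__init__()
        self.embed = nn.Embedding(num_chars, embed_dim)
        layers = []
        in_ch = embed_dim
        for i in range(num_layers):
            layers.append(CausalConvBlock(in_ch, channels, dilation=2 ** i))
            layers.append(SelfAttention(channels))   # attention on top of every conv
            in_ch = channels
        self.layers = nn.Sequential(*layers)
        self.classifier = nn.Linear(channels, num_classes)

    def forward(self, char_ids):               # char_ids: (batch, time), character indices
        x = self.embed(char_ids).transpose(1, 2)     # (batch, embed_dim, time)
        x = self.layers(x)
        return self.classifier(x.mean(dim=2))        # mean-pool over time


# Usage: classify a batch of character-encoded sentences (AG News has 4 classes).
model = TCNWithAttention(num_chars=128)
logits = model(torch.randint(0, 128, (8, 256)))      # -> (8, 4)
```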

Results

Dataset: AG News, without pre-processing

  • with attention: 0.82 accuracy

  • without attention: 0.81 accuracy

My thoughts on the results

Most simple models on AG News show around 0.81 accuracy (tested with A Structured Self-Attentive Sentence Embedding and TagSpace, both of which use word-based embeddings).

So reaching 0.82 accuracy with a character-based model seems worthwhile.

