TCN-with-attention
Temporal Convolutional Network with attention layer
The model concept largely follows the Simple Neural Attentive Meta-Learner (SNAIL). Unlike SNAIL, however, this model places an attention layer on top of every convolutional layer, and the attention size also differs from SNAIL's.
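The layout described above (a causal convolution followed by an attention block, repeated at every level) can be sketched roughly as below. This is a minimal PyTorch sketch, not the repository's actual code; the class names, hidden sizes, and the choice to concatenate attention output back onto the features (as SNAIL does) are all assumptions for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class CausalConv1d(nn.Module):
    """1D convolution, left-padded so outputs never see future timesteps."""
    def __init__(self, in_ch, out_ch, kernel_size, dilation=1):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation
        self.conv = nn.Conv1d(in_ch, out_ch, kernel_size, dilation=dilation)

    def forward(self, x):  # x: (batch, channels, time)
        return self.conv(F.pad(x, (self.pad, 0)))


class AttentionBlock(nn.Module):
    """SNAIL-style causal dot-product attention; output is concatenated
    onto the input features, as in the SNAIL paper."""
    def __init__(self, dim, key_size, value_size):
        super().__init__()
        self.q = nn.Linear(dim, key_size)
        self.k = nn.Linear(dim, key_size)
        self.v = nn.Linear(dim, value_size)
        self.scale = key_size ** 0.5

    def forward(self, x):  # x: (batch, time, dim)
        q, k, v = self.q(x), self.k(x), self.v(x)
        scores = q @ k.transpose(1, 2) / self.scale
        t = x.size(1)
        # Mask future positions so attention stays causal.
        mask = torch.triu(torch.ones(t, t, dtype=torch.bool), diagonal=1)
        scores = scores.masked_fill(mask, float("-inf"))
        out = torch.softmax(scores, dim=-1) @ v
        return torch.cat([x, out], dim=-1)


class TCNWithAttention(nn.Module):
    """Stack of dilated causal conv layers, each topped by attention
    (the key difference from SNAIL described above)."""
    def __init__(self, in_dim, hidden=64, levels=3, key=32, val=32):
        super().__init__()
        layers, dim = [], in_dim
        for i in range(levels):
            layers.append(CausalConv1d(dim, hidden, 3, dilation=2 ** i))
            layers.append(AttentionBlock(hidden, key, val))
            dim = hidden + val  # attention concatenates val extra features
        self.layers = nn.ModuleList(layers)

    def forward(self, x):  # x: (batch, time, in_dim)
        for layer in self.layers:
            if isinstance(layer, CausalConv1d):
                x = torch.relu(layer(x.transpose(1, 2))).transpose(1, 2)
            else:
                x = layer(x)
        return x
```

For character-level AG News classification, the input would be one-hot or embedded characters, with a pooling layer and a 4-way classifier head on top of the final features.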
Dataset: AG News, without pre-processing

- with attention: 0.82
- without attention: 0.81
Most simple models reach about 0.81 accuracy on AG News (tested with A Structured Self-Attentive Sentence Embedding and TagSpace, both of which use word-based embeddings). So 0.82 accuracy with a character-based model seems worthwhile.