
nonauto-nmt

2019-09-10

Non-Autoregressive Transformer

Code release for Non-Autoregressive Neural Machine Translation by Jiatao Gu, James Bradbury, Caiming Xiong, Victor O.K. Li, and Richard Socher.

Requires PyTorch 0.3, torchtext 0.2.1, and spaCy.

The pipeline for training a NAT model for a given language pair includes:

1. run_alignment_wmt_LANG.sh (runs fast_align for alignment supervision)
2. run_LANG.sh (trains an autoregressive model)
3. run_LANG_decode.sh (produces the distillation corpus for training the NAT)
4. run_LANG_fast.sh (trains the NAT model)
5. run_LANG_fine.sh (fine-tunes the NAT model)
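Assuming each script is invoked with no arguments and that LANG is substituted directly into the filename (here "ende" is used as a placeholder pair name, an assumption not stated in the README), the five stages above could be chained in order, stopping on the first failure:

```shell
#!/usr/bin/env bash
# Hedged sketch of the five-stage pipeline; "ende" is a placeholder pair
# name, and the real scripts may expect environment variables or arguments
# not shown here.
set -e
PAIR=ende  # assumption: substitute your own language pair for LANG

for step in \
    "run_alignment_wmt_${PAIR}.sh" \
    "run_${PAIR}.sh" \
    "run_${PAIR}_decode.sh" \
    "run_${PAIR}_fast.sh" \
    "run_${PAIR}_fine.sh"
do
    echo "next stage: $step"
    # bash "$step"   # uncomment to actually execute each stage
done
```

With `set -e`, a non-zero exit from any stage aborts the loop, so a failed autoregressive run cannot silently feed a stale distillation corpus into the NAT training steps.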

