
Knowledge-distillation-using-BERT


Using BERT, Google's pretrained language model, as the teacher, I aim to train a smaller bidirectional LSTM model through knowledge distillation. The Kaggle Toxic Comment Classification dataset is used for this task.
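The repository code is not reproduced here, so the sketch below is only illustrative of the general setup the description implies: a small PyTorch BiLSTM student trained against soft targets (logits) from a fine-tuned BERT teacher, plus the hard Toxic Comment labels. The class and function names (BiLSTMStudent, distillation_loss), the MSE-on-logits soft loss, the 50/50 loss weighting, and the treatment of the six toxicity labels as a multi-label problem are all assumptions, not the project's actual implementation.

```python
import torch
import torch.nn as nn

class BiLSTMStudent(nn.Module):
    """Small bidirectional-LSTM student meant to mimic a BERT teacher's logits."""
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128, num_labels=6):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * hidden_dim, num_labels)

    def forward(self, token_ids):
        embedded = self.embedding(token_ids)                 # (batch, seq, embed_dim)
        _, (hidden, _) = self.lstm(embedded)                 # hidden: (2, batch, hidden_dim)
        pooled = torch.cat([hidden[0], hidden[1]], dim=-1)   # concat forward/backward states
        return self.classifier(pooled)                       # (batch, num_labels)

def distillation_loss(student_logits, teacher_logits, labels, alpha=0.5):
    """Blend a soft loss against the teacher's logits with a hard loss on the true labels."""
    soft_loss = nn.functional.mse_loss(student_logits, teacher_logits)
    hard_loss = nn.functional.binary_cross_entropy_with_logits(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss

# Toy usage: teacher logits would in practice come from a fine-tuned BERT classifier.
student = BiLSTMStudent(vocab_size=30000)
token_ids = torch.randint(1, 30000, (8, 64))     # dummy batch of token ids
teacher_logits = torch.randn(8, 6)               # placeholder for precomputed BERT outputs
labels = torch.randint(0, 2, (8, 6)).float()     # six binary toxicity labels
loss = distillation_loss(student(token_ids), teacher_logits, labels)
loss.backward()
```

MSE on logits is one common choice for distilling BERT into a BiLSTM; temperature-scaled KL divergence on the probabilities is an equally plausible alternative if the project frames the task differently.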

