BERT-Keyword-Extractor

Deep Keyphrase Extraction using BERT

I have used the BERT token classification model to extract keywords from a sentence. Feel free to clone and use it. If you face any problems, please post them in the Issues section.

Special credits to the BERT authors, Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova (original repo), and to Hugging Face for the PyTorch version (original repo).

Requirements

You need:

PyTorch 1.0
Python 3.6
pytorch-pretrained-bert 0.4.0

Usage

The keyword-extractor.py script can be used to extract keywords from a sentence and accepts the following arguments:

optional arguments:
  -h, --help         show this help message and exit
  --sentence SEN        sentence to extract keywords
  --path LOAD        path to load model from

Example:

python keyword-extractor.py --sentence "BERT is a great model." --path "model.pt"
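
Under the hood, this is token classification: the sentence is WordPiece-tokenized, run through a fine-tuned BertForTokenClassification model, and the tokens whose predicted tag marks a keyword are collected. Below is a minimal sketch of that flow with pytorch-pretrained-bert; the B/I/O tag set, the bert-base-uncased tokenizer, and the assumption that model.pt holds a state dict are illustrative guesses, not a verbatim excerpt from keyword-extractor.py.

import torch
from pytorch_pretrained_bert import BertTokenizer, BertForTokenClassification

# Assumed tag inventory: B/I mark keyword tokens, O marks everything else.
idx2tag = {0: "B", 1: "I", 2: "O"}

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased", do_lower_case=True)
model = BertForTokenClassification.from_pretrained("bert-base-uncased", num_labels=len(idx2tag))
model.load_state_dict(torch.load("model.pt", map_location="cpu"))  # assumes a saved state dict
model.eval()

def extract_keywords(sentence):
    tokens = tokenizer.tokenize(sentence)
    input_ids = torch.tensor([tokenizer.convert_tokens_to_ids(tokens)])
    with torch.no_grad():
        logits = model(input_ids)  # shape: (1, seq_len, num_labels)
    tags = [idx2tag[i] for i in logits.argmax(-1).squeeze(0).tolist()]
    # Keep pieces tagged as part of a keyword; merging "##" subwords is left out for brevity.
    return [tok for tok, tag in zip(tokens, tags) if tag != "O"]

print(extract_keywords("BERT is a great model."))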

Training

You can also train the extractor yourself, starting from BERT's pre-trained weights. The main.py script can be used for training and accepts the following arguments:

optional arguments:
  -h, --help         show this help message and exit
  --data DATA        location of the data corpus
  --lr LR            initial learning rate
  --epochs EPOCHS    upper epoch limit
  --batch_size N     batch size
  --seq_len N        sequence length
  --save SAVE        path to save the final model

Example:

python main.py --data "maui-semeval2010-train" --lr 2e-5 --batch_size 32 --save "model.pt" --epochs 3
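
For orientation, the fine-tuning step is a standard token-classification loop: batches of WordPiece IDs with aligned tag IDs go through BertForTokenClassification, which returns a cross-entropy loss when labels are supplied. The sketch below substitutes a random toy batch for the real SemEval data pipeline; the three-tag label space, the BertAdam optimizer, and the gradient clipping are assumptions rather than a transcript of main.py.

import torch
from pytorch_pretrained_bert import BertForTokenClassification, BertAdam

NUM_TAGS = 3  # assumed B/I/O tag set
model = BertForTokenClassification.from_pretrained("bert-base-uncased", num_labels=NUM_TAGS)
optimizer = BertAdam(model.parameters(), lr=2e-5)

# Toy stand-in for one batch from the real data loader: (batch, seq_len)
# tensors of WordPiece vocabulary IDs and aligned tag IDs.
input_ids = torch.randint(0, 30522, (32, 75))
label_ids = torch.randint(0, NUM_TAGS, (32, 75))
attention_mask = torch.ones_like(input_ids)

model.train()
for epoch in range(3):
    # With labels supplied, the model computes the cross-entropy loss internally.
    loss = model(input_ids, attention_mask=attention_mask, labels=label_ids)
    loss.backward()
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
    optimizer.step()
    optimizer.zero_grad()

torch.save(model.state_dict(), "model.pt")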

This model has been trained on the SemEval 2010 dataset (scientific publications). You can swap it out for your own custom dataset.
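
If you bring your own corpus, it needs token-level keyword labels in whatever format main.py's loader expects. A common convention for this kind of task (an assumption here, since the exact file layout is defined by the loader) is one token per line with a B/I/O tag, where B opens a keyphrase, I continues it, O is background, and a blank line separates sentences:

Deep        B
keyphrase   I
extraction  I
using       O
BERT        B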

Code explanations

I have provided an explanation of keyphrase extraction in the form of a Python notebook, which you can view here.

Hyper-parameter Tuning

I ran ablation experiments following the BERT paper; the results are below. I suggest using the parameters in row 4 of the table (learning rate 3e-5, 4 epochs). All training was done with a batch size of 32.

Learning Rate   Epochs   Validation Loss   Validation Accuracy   F1-Score
3.00E-05        3        0.05294724515     98.30%                0.5318559557
5.00E-05        3        0.04899719357     98.47%                0.56218628
2.00E-05        3        0.05733459462     98.15%                0.4390547264
3.00E-05        4        0.05020467712     98.48%                0.5528169014
5.00E-05        4        0.05194576555     98.43%                0.5780836421
2.00E-05        4        0.05373481681     98.25%                0.5019740553

