
Massively Multilingual Transfer for NER

2019-09-19
Abstract: In cross-lingual transfer, NLP models trained on one or more source languages are applied to a low-resource target language. While most prior work has used a single source model or a few carefully selected models, here we consider a "massive" setting with many such models. This setting raises the problem of poor transfer, particularly from distant languages. We propose two techniques for modulating the transfer, suitable for zero-shot and few-shot learning, respectively. Evaluating on named entity recognition, we show that our techniques are much more effective than strong baselines, including standard ensembling, and that our unsupervised method rivals oracle selection of the single best individual model.
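The setting the abstract describes, where many source-language models transfer to the same target with varying quality, can be pictured with a small sketch. The weighted token-level vote below is only an illustrative stand-in (the `aggregate_predictions` helper and the toy tag sequences are hypothetical, not the paper's actual techniques): uniform weights correspond to the standard ensembling baseline, while non-uniform weights stand in for modulating transfer away from distant languages.

```python
from typing import Dict, List

def aggregate_predictions(
    predictions: List[List[str]],  # predictions[i][t]: tag from source model i for token t
    weights: List[float],          # one transfer weight per source model
) -> List[str]:
    """Combine per-token NER tags from many source models by weighted vote."""
    n_tokens = len(predictions[0])
    combined = []
    for t in range(n_tokens):
        scores: Dict[str, float] = {}
        for model_tags, weight in zip(predictions, weights):
            scores[model_tags[t]] = scores.get(model_tags[t], 0.0) + weight
        combined.append(max(scores, key=scores.get))
    return combined

# Toy example: three source-language models tag the same 4-token target sentence.
preds = [
    ["B-PER", "I-PER", "O", "O"],      # a closely related source language
    ["B-PER", "I-PER", "O", "B-LOC"],  # another related source
    ["O",     "O",     "O", "O"],      # a distant source that transfers poorly
]
print(aggregate_predictions(preds, [1.0, 1.0, 1.0]))  # zero-shot: uniform weights
print(aggregate_predictions(preds, [1.0, 1.0, 0.1]))  # few-shot: down-weight the distant source
```

In the zero-shot case no target-language supervision is available, so the distant source votes with full weight; with even a few annotated target examples, per-model weights can be estimated and poor transfer sources suppressed.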


