CROSS-LINGUAL ABILITY OF MULTILINGUAL BERT: AN EMPIRICAL STUDY

2019-12-31

Abstract

Recent work has exhibited the surprising cross-lingual abilities of multilingual BERT (M-BERT) – surprising since it is trained without any cross-lingual objective and with no aligned data. In this work, we provide a comprehensive study of the contribution of different components in M-BERT to its cross-lingual ability. We study the impact of linguistic properties of the languages, the architecture of the model, and the learning objectives. The experimental study is done in the context of three typologically different languages – Spanish, Hindi, and Russian – and using two conceptually different NLP tasks, textual entailment and named entity recognition. Among our key conclusions is the fact that the lexical overlap between languages plays a negligible role in the cross-lingual success, while the depth of the network is an integral part of it.
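Below is a minimal sketch of the zero-shot cross-lingual transfer setup the paper evaluates: a single multilingual BERT encoder is fine-tuned on a task in one language (e.g., English entailment data) and then applied directly to another language (e.g., Spanish) with no aligned or translated data. The checkpoint name and XNLI-style label set are assumptions for illustration, not the paper's exact code; the classification head here is untrained, so predictions are only meaningful after fine-tuning.

```python
# Sketch of zero-shot cross-lingual evaluation with multilingual BERT.
# Assumes the Hugging Face "bert-base-multilingual-cased" checkpoint; the
# paper additionally trains bilingual BERT variants from scratch.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "bert-base-multilingual-cased"
LABELS = ["entailment", "neutral", "contradiction"]  # XNLI-style label set (assumed)

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=len(LABELS))
model.eval()

def predict(premise: str, hypothesis: str) -> str:
    """Classify a premise/hypothesis pair; meaningful only after fine-tuning on source-language NLI data."""
    inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    return LABELS[int(logits.argmax(dim=-1))]

# The same weights handle the training language (English) and the evaluation
# language (Spanish) without any cross-lingual supervision.
print(predict("A man is playing a guitar.", "A person makes music."))
print(predict("Un hombre toca la guitarra.", "Una persona hace música."))
```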
