Title
Can Monolingual Pretrained Models Help Cross-Lingual Classification?
Abstract
Multilingual pretrained language models (such as multilingual BERT) have achieved impressive results for cross-lingual transfer. However, because model capacity stays constant while covering many languages, multilingual pre-training usually lags behind its monolingual counterparts. In this work, we present two approaches to improve zero-shot cross-lingual classification by transferring knowledge from monolingual pretrained models to multilingual ones. Experimental results on two cross-lingual classification benchmarks show that our methods outperform vanilla multilingual fine-tuning.
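The abstract describes transferring knowledge from a monolingual pretrained model into a multilingual one for zero-shot cross-lingual classification. The sketch below shows one natural instantiation of that idea, teacher-student distillation: an English (monolingual) teacher guides a multilingual student on English training data, and the student is then applied to other languages without further fine-tuning. This is only an illustrative baseline under assumptions; the model names, loss weighting, and hyperparameters are placeholders, and the paper's two actual approaches may differ.

```python
# Hypothetical sketch: distill a monolingual (English) teacher into a
# multilingual student for zero-shot cross-lingual classification.
# Model names, num_labels, and the distillation recipe are assumptions,
# not the paper's exact procedure.
import torch
import torch.nn.functional as F
from transformers import AutoModelForSequenceClassification, AutoTokenizer

teacher_name = "bert-base-uncased"              # monolingual English teacher
student_name = "bert-base-multilingual-cased"   # multilingual student
num_labels = 3

teacher = AutoModelForSequenceClassification.from_pretrained(teacher_name, num_labels=num_labels)
student = AutoModelForSequenceClassification.from_pretrained(student_name, num_labels=num_labels)
teacher_tok = AutoTokenizer.from_pretrained(teacher_name)
student_tok = AutoTokenizer.from_pretrained(student_name)

teacher.eval()  # assume the teacher was already fine-tuned on English labeled data
optimizer = torch.optim.AdamW(student.parameters(), lr=2e-5)
temperature = 2.0

def distillation_step(texts, labels):
    """One training step: hard-label cross-entropy plus soft-label KL to the teacher."""
    t_inputs = teacher_tok(texts, padding=True, truncation=True, return_tensors="pt")
    s_inputs = student_tok(texts, padding=True, truncation=True, return_tensors="pt")
    labels = torch.tensor(labels)

    with torch.no_grad():
        t_logits = teacher(**t_inputs).logits
    s_logits = student(**s_inputs).logits

    ce_loss = F.cross_entropy(s_logits, labels)
    kd_loss = F.kl_div(
        F.log_softmax(s_logits / temperature, dim=-1),
        F.softmax(t_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    loss = ce_loss + kd_loss

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# After training on English data only, the multilingual student is evaluated
# directly (zero-shot) on non-English inputs with no further fine-tuning.
```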
Year
2020
Venue
AACL/IJCNLP
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
5
Name | Order | Citations | PageRank
Chi Zewen | 1 | 0 | 0.34
Li Dong | 2 | 582 | 31.86
Furu Wei | 3 | 1956 | 107.57
Xian-Ling Mao | 4 | 99 | 25.19
Heyan Huang | 5 | 173 | 61.47