Title
Cross-Lingual BERT Transformation for Zero-Shot Dependency Parsing
Abstract
This paper investigates the problem of learning cross-lingual representations in a contextual space. We propose Cross-Lingual BERT Transformation (CLBT), a simple and efficient approach to generate cross-lingual contextualized word embeddings based on publicly available pre-trained BERT models (Devlin et al., 2018). In this approach, a linear transformation is learned from contextual word alignments to align the contextualized embeddings independently trained in different languages. We demonstrate the effectiveness of this approach on zero-shot cross-lingual transfer parsing. Experiments show that our embeddings substantially outperform the previous state-of-the-art that uses static embeddings. We further compare our approach with XLM (Lample and Conneau, 2019), a recently proposed cross-lingual language model trained with massive parallel data, and achieve highly competitive results.
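The core of CLBT, as described in the abstract, is a single linear transformation fitted on contextual embeddings of word-aligned token pairs from parallel text. As a rough illustration only (not necessarily the paper's exact training procedure), the sketch below fits such a map in closed form via orthogonal Procrustes; the function name fit_linear_map, the toy data, and the 768-dimensional vectors are illustrative assumptions.

import numpy as np

def fit_linear_map(src_vecs: np.ndarray, tgt_vecs: np.ndarray) -> np.ndarray:
    """Fit an orthogonal map W minimizing ||src_vecs @ W - tgt_vecs||_F.

    src_vecs, tgt_vecs: (n_pairs, dim) contextual embeddings of aligned
    source/target tokens extracted from parallel sentences.
    Closed-form solution via SVD (orthogonal Procrustes).
    """
    u, _, vt = np.linalg.svd(src_vecs.T @ tgt_vecs)
    return u @ vt

# Toy usage: recover a known rotation between two hypothetical
# 768-dimensional embedding spaces (BERT-base hidden size).
rng = np.random.default_rng(0)
src = rng.standard_normal((1000, 768))
true_rotation = np.linalg.qr(rng.standard_normal((768, 768)))[0]
tgt = src @ true_rotation
W = fit_linear_map(src, tgt)
aligned = src @ W  # source embeddings mapped into the target space
print(np.allclose(aligned, tgt, atol=1e-6))  # True

Once W is fitted, source-language contextual embeddings can be mapped into the target space and fed to a parser trained only on target-language data, which is the zero-shot transfer setting the abstract refers to.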
Year
2019
DOI
10.18653/v1/D19-1575
Venue
EMNLP/IJCNLP (1)
DocType
Conference
Volume
D19-1
Citations
1
PageRank
0.35
References
0
Authors
5
Name            Order  Citations  PageRank
Yuxuan Wang     1      144        12.04
Wanxiang Che    2      711        66.39
Jiang Guo       3      33         5.74
Yijia Liu       4      49         7.34
Ting Liu        5      2735       232.31