XLM-E: Cross-lingual Language Model Pre-training via ELECTRA | 2 | 0.36 | 2022 |
Multilingual Machine Translation Systems from Microsoft for WMT21 Shared Task | 0 | 0.34 | 2021 |
Allocating Large Vocabulary Capacity for Cross-Lingual Language Model Pre-Training | 0 | 0.34 | 2021 |
mT6: Multilingual Pretrained Text-to-Text Transformer with Translation Pairs | 0 | 0.34 | 2021 |
InfoXLM: An Information-Theoretic Framework for Cross-Lingual Language Model Pre-Training | 0 | 0.34 | 2021 |