ERNIE-Gram: Pre-Training with Explicitly N-Gram Masked Language Modeling for Natural Language Understanding | 0 | 0.34 | 2021 |
ERNIE-M: Enhanced Multilingual Representation by Aligning Cross-lingual Semantics with Monolingual Corpora | 0 | 0.34 | 2021 |
ERNIE 2.0: A Continual Pre-Training Framework for Language Understanding | 1 | 0.34 | 2020 |