Title
ERNIE 2.0: A Continual Pre-Training Framework for Language Understanding
Abstract
Recently, pre-trained models have achieved state-of-the-art results in various language understanding tasks. Current pre-training procedures usually focus on training the model with several simple tasks to grasp the co-occurrence of words or sentences. However, besides co-occurrence information, there is other valuable lexical, syntactic, and semantic information in training corpora, such as named entities, semantic closeness, and discourse relations. To extract this lexical, syntactic, and semantic information from training corpora, we propose a continual pre-training framework named ERNIE 2.0, which incrementally builds pre-training tasks and then learns pre-trained models on these constructed tasks via continual multi-task learning. Based on this framework, we construct several tasks and train the ERNIE 2.0 model to capture lexical, syntactic, and semantic aspects of information in the training data. Experimental results demonstrate that the ERNIE 2.0 model outperforms BERT and XLNet on 16 tasks, including the English tasks of the GLUE benchmark and several similar tasks in Chinese. The source code and pre-trained models have been released at https://github.com/PaddlePaddle/ERNIE.
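The core idea of the framework described in the abstract is that pre-training tasks are introduced in stages, and each new task is trained jointly with all previously introduced tasks rather than replacing them. Below is a minimal Python sketch of such a continual multi-task learning loop. All names here (Task, train_step, stages, steps_per_stage) are hypothetical illustrations for exposition, not the paper's actual API; the released implementation uses PaddlePaddle and a more elaborate training schedule.

```python
# Sketch of continual multi-task pre-training: tasks arrive in stages, and
# at each stage the shared model is updated on batches sampled from the new
# task together with every previously added task, so earlier tasks are not
# forgotten. Hypothetical names throughout; not the paper's implementation.

import itertools
import random
from dataclasses import dataclass
from typing import Callable, Iterator, List


@dataclass
class Task:
    name: str                  # e.g. "masked_lm", "sentence_reordering"
    batches: Iterator[object]  # endless stream of training batches


def continual_multitask_pretrain(
    stages: List[List[Task]],
    train_step: Callable[[Task, object], float],
    steps_per_stage: int = 1000,
) -> None:
    """Train through the stages, mixing each new task with all old ones."""
    active: List[Task] = []
    for stage_tasks in stages:
        active.extend(stage_tasks)          # add the newly built tasks
        for _ in range(steps_per_stage):
            task = random.choice(active)    # sample an old or new task
            batch = next(task.batches)
            train_step(task, batch)         # one update of the shared model


# Example usage with dummy batch streams and a stub training step;
# the three stages loosely mirror lexical / syntactic / semantic tasks.
def dummy_batches() -> Iterator[object]:
    return itertools.count()


stages = [
    [Task("masked_lm", dummy_batches())],
    [Task("sentence_reordering", dummy_batches())],
    [Task("discourse_relation", dummy_batches())],
]
continual_multitask_pretrain(stages, lambda t, b: 0.0, steps_per_stage=10)
```

The key design choice this sketch illustrates is that the set of active tasks only grows: sampling from the union of old and new tasks is what distinguishes continual multi-task learning from sequentially fine-tuning on one task at a time, which would risk forgetting earlier tasks.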
Year
2020
Venue
National Conference on Artificial Intelligence (AAAI)
DocType
Conference
Volume
34
ISSN
2159-5399
Citations
1
PageRank
0.34
References
0
Authors
7
Name            Order   Citations   PageRank
Yu Sun          1       4           4.09
Shuohuan Wang   2       4           3.76
Yukun Li        3       25          7.06
Shikun Feng     4       5           3.78
Hao Tian        5       1           1.02
Hua Wu          6       664         59.26
Haifeng Wang    7       806         94.25