Title
bert2BERT: Towards Reusable Pretrained Language Models
Abstract
In recent years, researchers have tended to pre-train ever-larger language models to explore the upper limit of deep models. However, pre-training a large language model requires intensive computational resources, and most models are trained from scratch without reusing existing pre-trained models, which is wasteful. In this paper, we propose bert2BERT, which can effectively transfer the knowledge of an existing smaller pre-trained model to a large model through parameter initialization and significantly improve the pre-training efficiency of the large model. Specifically, we extend the previous function-preserving method (Chen et al., 2016), proposed in computer vision, to the Transformer-based language model, and further improve it by proposing a novel method, advanced knowledge, for the large model's initialization. In addition, a two-stage learning method is proposed to further accelerate pre-training. We conduct extensive experiments on representative PLMs (e.g., BERT and GPT) and demonstrate that (1) our method can save a significant amount of training cost compared with baselines, including learning from scratch, StackBERT (Gong et al., 2019) and MSLT (Yang et al., 2020); (2) our method is generic and applicable to different types of pre-trained models. In particular, bert2BERT saves about 45% and 47% of the computational cost of pre-training BERT_BASE and GPT_BASE, respectively, by reusing models of almost half their size.
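The abstract builds on the function-preserving expansion of Chen et al. (2016) (Net2Net). As a rough illustration only, and not the authors' released code, the sketch below shows the core Net2Net-style idea of widening a hidden layer: duplicate hidden units and split their fan-out weights so the larger pair of linear maps computes exactly the same function as the smaller one. The function name widen_pair and all shapes are hypothetical choices for this example.

    # Illustrative Net2Net-style width expansion (Chen et al., 2016): copy
    # hidden units and split their fan-out weights so the widened pair of
    # linear maps computes exactly the same function as the original pair.
    import numpy as np

    def widen_pair(W_in, W_out, new_width, rng=None):
        # W_in:  (hidden, d_in)  -- produces the hidden layer
        # W_out: (d_out, hidden) -- consumes the hidden layer
        rng = rng or np.random.default_rng(0)
        hidden = W_in.shape[0]
        # Keep every existing unit, then pick units at random to duplicate.
        mapping = np.concatenate(
            [np.arange(hidden), rng.integers(0, hidden, new_width - hidden)])
        W_in_big = W_in[mapping]                         # copy duplicated rows
        counts = np.bincount(mapping, minlength=hidden)
        W_out_big = W_out[:, mapping] / counts[mapping]  # split fan-out weights
        return W_in_big, W_out_big

    # Sanity check: the widened two-layer map matches the original one.
    rng = np.random.default_rng(1)
    x = rng.normal(size=(4, 8))
    W1 = rng.normal(size=(16, 8))
    W2 = rng.normal(size=(6, 16))
    W1b, W2b = widen_pair(W1, W2, new_width=24, rng=rng)
    assert np.allclose(x @ W1.T @ W2.T, x @ W1b.T @ W2b.T)

Per the abstract, bert2BERT extends this kind of expansion to Transformer-based language models and further adds advanced knowledge initialization and two-stage learning; those parts are not reflected in this minimal sketch.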
Year
2022
DOI
10.18653/v1/2022.acl-long.151
Venue
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (ACL 2022), Vol. 1: Long Papers
DocType
Conference
Volume
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Citations
0
PageRank
0.34
References
0
Authors
10
Name          Order  Citations  PageRank
Cheng Chen    1      1          1.03
Yichun Yin    2      27         2.58
Lifeng Shang  3      485        30.96
Xin Jiang     4      150        32.43
Yujia Qin     5      0          1.01
Fengyu Wang   6      15         5.76
Zhi Wang      7      0          0.34
Xiao Chen     8      0          1.35
Zhiyuan Liu   9      2037       123.68
Qun Liu       10     2149       203.11