Title
TextBrewer: An Open-Source Knowledge Distillation Toolkit for Natural Language Processing
Abstract
In this paper, we introduce TextBrewer, an open-source knowledge distillation toolkit designed for natural language processing. It works with different neural network models and supports various kinds of supervised learning tasks, such as text classification, reading comprehension, and sequence labeling. TextBrewer provides a simple and uniform workflow that enables quick setup of distillation experiments with highly flexible configurations. It offers a set of predefined distillation methods and can be extended with custom code. As a case study, we use TextBrewer to distill BERT on several typical NLP tasks. With simple configurations, we achieve results that are comparable to or even higher than those of public distilled BERT models with similar numbers of parameters.
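To make the workflow described in the abstract concrete, below is a minimal, self-contained sketch of a distillation run with TextBrewer. The class and method names (GeneralDistiller, TrainingConfig, DistillationConfig, the adaptor convention, distiller.train) follow the toolkit's public documentation, though exact signatures may vary across versions; the tiny classifiers, random data, and hyperparameters here are hypothetical stand-ins for a real teacher/student pair such as a fine-tuned BERT teacher and a smaller student.

import os

import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset
from textbrewer import GeneralDistiller, TrainingConfig, DistillationConfig

# Hypothetical stand-ins for a real teacher/student pair (e.g. a fine-tuned
# BERT teacher and a smaller transformer student). The forward signature
# accepts the labels carried in each dataloader batch and returns logits.
class TinyClassifier(nn.Module):
    def __init__(self, hidden_size):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(128, hidden_size), nn.ReLU(), nn.Linear(hidden_size, 2))

    def forward(self, inputs, labels=None):
        return self.net(inputs)

teacher_model = TinyClassifier(hidden_size=256)
student_model = TinyClassifier(hidden_size=64)

# Toy data: 256 random 128-dimensional examples with binary labels.
dataset = TensorDataset(torch.randn(256, 128), torch.randint(0, 2, (256,)))
dataloader = DataLoader(dataset, batch_size=32)
optimizer = torch.optim.AdamW(student_model.parameters(), lr=1e-3)

# An adaptor tells the distiller how to read a model's raw outputs;
# here the model returns the classification logits directly.
def simple_adaptor(batch, model_outputs):
    return {'logits': model_outputs}

os.makedirs('saved_models', exist_ok=True)  # where student checkpoints go
train_config = TrainingConfig(device='cpu', output_dir='saved_models')
distill_config = DistillationConfig(
    temperature=8,        # softens the teacher/student distributions
    hard_label_weight=0,  # train on the soft (distillation) loss only
)

distiller = GeneralDistiller(
    train_config=train_config, distill_config=distill_config,
    model_T=teacher_model, model_S=student_model,
    adaptor_T=simple_adaptor, adaptor_S=simple_adaptor)

# Running train() inside the distiller's context manager drives the whole
# distillation loop: forward both models, compute the KD loss, update the
# student, and checkpoint it each epoch.
with distiller:
    distiller.train(optimizer, dataloader, num_epochs=1)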
Year
2020
DOI
10.18653/v1/2020.acl-demos.2
Venue
ACL (demo)
DocType
Conference
Volume
2020.acl-demos
Citations
0
PageRank
0.34
References
0
Authors
7
Name            Order   Citations   PageRank
Ziqing Yang     1       0           0.68
Yiming Cui      2       87          13.40
Zhipeng Chen    3       83          10.24
Wanxiang Che    4       711         66.39
Ting Liu        5       2735        232.31
Shijin Wang     6       180         31.56
Guoping Hu      7       309         37.32