| Abstract |
|---|
| Recently, pretrained language models (PLMs) have achieved exceptional success in language generation. To leverage the rich knowledge encoded by PLMs, a simple yet powerful paradigm is to use … |
| Year | Venue | DocType |
|---|---|---|
| 2022 | International Conference on Computational Linguistics | Conference |

| Volume | Citations | PageRank |
|---|---|---|
| Proceedings of the 29th International Conference on Computational Linguistics | 0 | 0.34 |

| References | Authors |
|---|---|
| 0 | 4 |

| Name | Order | Citations | PageRank |
|---|---|---|---|
| Tianyi Tang | 1 | 0 | 0.34 |
| Junyi Li | 2 | 0 | 2.70 |
| Wayne Xin Zhao | 3 | 1275 | 66.73 |
| Ji-Rong Wen | 4 | 4431 | 265.98 |