Title
Context-Tuning: Learning Contextualized Prompts for Natural Language Generation.
Abstract
Recently, pretrained language models (PLMs) have had exceptional success in language generation. To leverage the rich knowledge encoded by PLMs, a simple yet powerful paradigm is to use
Year: 2022
Venue: International Conference on Computational Linguistics
DocType: Conference
Volume: Proceedings of the 29th International Conference on Computational Linguistics
Citations: 0
PageRank: 0.34
References: 0
Authors: 4
Name            Order  Citations  PageRank
Tianyi Tang     1      0          0.34
Junyi Li        2      0          2.70
Wayne Xin Zhao  3      1275       66.73
Ji-Rong Wen     4      4431       265.98