Title
GENSF: Simultaneous Adaptation of Generative Pre-trained Models and Slot Filling
Abstract
In transfer learning, it is imperative to achieve strong alignment between a pre-trained model and a downstream task. Prior work has done this by proposing task-specific pre-training objectives, which sacrifices the inherent scalability of the transfer learning paradigm. We instead achieve strong alignment by simultaneously modifying both the pre-trained model and the formulation of the downstream task, which is more efficient and preserves the scalability of transfer learning. We present GENSF (Generative Slot Filling), which leverages a generative pre-trained open-domain dialog model for slot filling. GENSF (1) adapts the pre-trained model by incorporating inductive biases about the task and (2) adapts the downstream task by reformulating slot filling to better leverage the pre-trained model's capabilities. GENSF achieves state-of-the-art results on two slot filling datasets, with strong gains in few-shot and zero-shot settings. We achieve a 9-point F1 score improvement in zero-shot slot filling. This highlights the value of strong alignment between the pre-trained model and the downstream task.
Year: 2021
Venue: SIGDIAL 2021: 22nd Annual Meeting of the Special Interest Group on Discourse and Dialogue
DocType: Conference
Citations: 0
PageRank: 0.34
References: 0
Authors: 2

Name             Order  Citations  PageRank
Shikib Mehri     1      0          0.68
Maxine Eskenazi  2      9791       27.53