Title
Unsupervised Dual Paraphrasing for Two-stage Semantic Parsing
Abstract
One daunting problem for semantic parsing is the scarcity of annotation. Aiming to reduce nontrivial human labor, we propose a two-stage semantic parsing framework, where the first stage utilizes an unsupervised paraphrase model to convert an unlabeled natural language utterance into its canonical utterance. The downstream naive semantic parser accepts the intermediate output and returns the target logical form. Furthermore, the entire training process is split into two phases: pre-training and cycle learning. Three tailored self-supervised tasks are introduced throughout training to activate the unsupervised paraphrase model. Experimental results on the Overnight and GeoGranno benchmarks demonstrate that our framework is effective and compatible with supervised training.
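The abstract outlines a two-stage pipeline: an unsupervised paraphrase model first rewrites a natural language utterance as a canonical utterance, and a downstream naive semantic parser then maps that canonical utterance to the target logical form. A minimal Python sketch of this interface is given below; the class and function names (ParaphraseModel, NaiveSemanticParser, parse_two_stage) are hypothetical placeholders for illustration, not the authors' implementation.

class ParaphraseModel:
    """Stage 1 (illustrative): rewrite a natural language utterance as a
    canonical utterance. Per the abstract, this model is trained without
    labeled paraphrase pairs, via pre-training followed by cycle learning
    with three self-supervised tasks."""

    def paraphrase(self, utterance: str) -> str:
        raise NotImplementedError  # e.g. a seq2seq rewriter in a real system


class NaiveSemanticParser:
    """Stage 2 (illustrative): map a canonical utterance to a logical form."""

    def parse(self, canonical_utterance: str) -> str:
        raise NotImplementedError  # e.g. a parser trained on canonical inputs


def parse_two_stage(utterance: str,
                    paraphraser: ParaphraseModel,
                    parser: NaiveSemanticParser) -> str:
    """Full pipeline: NL utterance -> canonical utterance -> logical form."""
    canonical = paraphraser.paraphrase(utterance)
    return parser.parse(canonical)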
Year
2020
Venue
ACL
DocType
Conference
Volume
2020.acl-main
Citations
0
PageRank
0.34
References
0
Authors
8
Name           Order  Citations  PageRank
Ruisheng Cao   1      0          1.35
Su Zhu         2      44         7.48
Chenyu Yang    3      0          0.34
Chen Liu       4      151        25.89
Rao Ma         5      0          1.35
Yanbin Zhao    6      0          1.35
Lu Chen        7      0          0.34
Kai Yu         8      1082       90.58