Title
Paired Examples as Indirect Supervision in Latent Decision Models.
Abstract
Compositional, structured models are appealing because they explicitly decompose problems and provide interpretable intermediate outputs that give confidence that the model is not simply latching onto data artifacts. Learning these models is challenging, however, because end-task supervision only provides a weak indirect signal on what values the latent decisions should take. This often results in the model failing to learn to perform the intermediate tasks correctly. In this work, we introduce a way to leverage paired examples that provide stronger cues for learning latent decisions. When two related training examples share internal substructure, we add an additional training objective to encourage consistency between their latent decisions. Such an objective does not require external supervision for the values of the latent output, or even the end task, yet provides a training signal in addition to that provided by the individual training examples themselves. We apply our method to improve compositional question answering using neural module networks on the DROP dataset. We explore three ways to acquire paired questions in DROP: (a) discovering naturally occurring paired examples within the dataset, (b) constructing paired examples using templates, and (c) generating paired examples using a question generation model. We empirically demonstrate that our proposed approach improves both in- and out-of-distribution generalization and leads to correct latent decision predictions.
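The abstract's core idea, encouraging consistency between the latent decisions of two paired examples alongside the usual end-task losses, can be sketched as follows. This is a minimal illustration only: it assumes the latent decisions are probability distributions (e.g., attention over passage tokens) and uses a symmetric KL penalty with a weighting coefficient `lam`; the function names and the choice of divergence are assumptions for illustration, not the authors' exact formulation.

```python
import torch
import torch.nn.functional as F


def paired_consistency_loss(latent_a: torch.Tensor, latent_b: torch.Tensor) -> torch.Tensor:
    """Symmetric KL divergence between the latent-decision distributions of two
    paired examples that share internal substructure.

    Both inputs are assumed to be probability distributions of shape
    (batch, num_tokens) over the same support.
    """
    eps = 1e-8  # avoid log(0) for zero-probability entries
    log_a = (latent_a + eps).log()
    log_b = (latent_b + eps).log()
    # F.kl_div(log_q, p) computes KL(p || q); averaging both directions makes it symmetric.
    kl_1 = F.kl_div(log_a, latent_b, reduction="batchmean")
    kl_2 = F.kl_div(log_b, latent_a, reduction="batchmean")
    return 0.5 * (kl_1 + kl_2)


def total_loss(end_loss_a, end_loss_b, latent_a, latent_b, lam=1.0):
    """End-task losses for both paired examples plus the weighted consistency term."""
    return end_loss_a + end_loss_b + lam * paired_consistency_loss(latent_a, latent_b)
```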
Year
2021
Venue
EMNLP
Keywords
Decision model, Question answering, Machine learning, Computer science, Artificial intelligence, Question generation
DocType
Conference
Volume
2021.emnlp-main
Citations
0
PageRank
0.34
References
0
Authors
4
Name             Order  Citations  PageRank
Nitish Gupta     1      17         4.70
Sameer Singh     2      1060       71.63
Matthew Gardner  3      704        38.49
Dan Roth         4      7735       695.19