Title
Approximation-Aware Dependency Parsing by Belief Propagation.
Abstract
We show how to train the fast dependency parser of Smith and Eisner (2008) for improved accuracy. This parser can consider higher-order interactions among edges while retaining O(n^3) runtime. It outputs the parse with maximum expected recall; for speed, however, this expectation is taken under a posterior distribution that is constructed only approximately, using loopy belief propagation through structured factors. We show how to adjust the model parameters to compensate for the errors introduced by this approximation, by following the gradient of the actual loss on training data. We find this gradient by back-propagation. That is, we treat the entire parser (approximations and all) as a differentiable circuit, as Stoyanov et al. (2011) and Domke (2010) did for loopy CRFs. The resulting trained parser obtains higher accuracy with fewer iterations of belief propagation than one trained by conditional log-likelihood.
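The core idea in the abstract is to run a fixed number of loopy belief propagation iterations, treat that approximate-inference procedure itself as a differentiable circuit, and follow the gradient of the actual training loss computed on the approximate beliefs. Below is a minimal sketch of that idea, not the authors' parser: a toy three-variable binary cycle MRF in JAX, with a log-loss standing in for the paper's expected-recall objective. All names (EDGES, beliefs, loss) and the toy model are illustrative assumptions.

```python
# Hypothetical sketch of back-propagation through loopy BP, in the spirit of
# Stoyanov et al. (2011) and Domke (2010). NOT the paper's dependency parser.
import jax
import jax.numpy as jnp

EDGES = [(0, 1), (1, 2), (2, 0)]   # a cycle, so BP is genuinely "loopy"
N, K = 3, 5                        # number of variables, BP iterations

def beliefs(params):
    """Run K synchronous sum-product iterations; return node marginals."""
    log_pots, unary = params       # (|E|,2,2) edge potentials, (N,2) unaries
    msgs = jnp.zeros((2, len(EDGES), 2))   # log-messages, 2 directions per edge
    for _ in range(K):
        new = []
        for d in (0, 1):                   # 0: i->j, 1: j->i
            for e, (i, j) in enumerate(EDGES):
                src, dst = (i, j) if d == 0 else (j, i)
                # unary at src plus all incoming log-messages, except dst->src
                inc = unary[src]
                for d2 in (0, 1):
                    for e2, (a, b) in enumerate(EDGES):
                        s2, t2 = (a, b) if d2 == 0 else (b, a)
                        if t2 == src and not (e2 == e and s2 == dst):
                            inc = inc + msgs[d2, e2]
                pot = log_pots[e] if d == 0 else log_pots[e].T
                m = jax.nn.logsumexp(pot + inc[:, None], axis=0)
                new.append(m - jax.nn.logsumexp(m))   # normalize in log space
        msgs = jnp.stack(new).reshape(2, len(EDGES), 2)
    # node belief = unary + all incoming messages, then softmax
    out = []
    for v in range(N):
        s = unary[v]
        for d in (0, 1):
            for e, (i, j) in enumerate(EDGES):
                _, t = (i, j) if d == 0 else (j, i)
                if t == v:
                    s = s + msgs[d, e]
        out.append(jax.nn.softmax(s))
    return jnp.stack(out)

def loss(params, gold):
    """Training loss evaluated on the *approximate* beliefs."""
    b = beliefs(params)
    return -jnp.mean(jnp.log(b[jnp.arange(N), gold]))

key = jax.random.PRNGKey(0)
params = (0.1 * jax.random.normal(key, (len(EDGES), 2, 2)),
          jnp.zeros((N, 2)))
gold = jnp.array([0, 1, 0])
# The entire inference procedure is one differentiable circuit, so the
# gradient automatically compensates for the errors of truncated loopy BP.
grads = jax.grad(loss)(params, gold)
print(loss(params, gold), grads[1])
```

Because the K unrolled BP iterations sit inside the loss, jax.grad differentiates through the approximation itself, which is the approximation-aware training the abstract describes; training to optimize the exact-inference likelihood instead would ignore those errors.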
Year
2015
Venue
Trans. Assoc. Comput. Linguistics
Field
Training set, Computer science, Dependency grammar, Posterior probability, Differentiable function, Artificial intelligence, Natural language processing, Parsing, Backpropagation, CRFs, Machine learning, Belief propagation
DocType
Journal
Volume
3
Citations
6
PageRank
0.55
References
27
Authors
3
Name | Order | Citations | PageRank
Matthew Gormley | 1 | 841 | 0.25
Mark Dredze | 2 | 3092 | 176.22
Jason Eisner | 3 | 1825 | 173.00