Abstract |
---|
We present structured perceptron training for neural network transition-based dependency parsing. We learn the neural network representation using a gold corpus augmented by a large number of automatically parsed sentences. Given this fixed network representation, we learn a final layer using the structured perceptron with beam-search decoding. On the Penn Treebank, our parser reaches 94.26% unlabeled and 92.41% labeled attachment accuracy, which to our knowledge is the best accuracy on Stanford Dependencies to date. We also provide in-depth ablative analysis to determine which aspects of our model provide the largest gains in accuracy. |
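The training scheme sketched in the abstract (freeze the learned representation, then fit only a final linear layer with the structured perceptron over beam-search decodings) can be illustrated with a toy reconstruction. Everything in the sketch below is an illustrative assumption rather than the authors' implementation: the random projection `phi` stands in for the frozen network's activations, the three-action inventory and fixed sequence length stand in for a real transition system, and the early-update rule follows the standard Collins and Roark (2004) recipe commonly paired with beam-search perceptron training.

```python
import numpy as np

rng = np.random.default_rng(0)

N_ACTIONS = 3   # stand-in for a parser's transitions (e.g. SHIFT, LEFT-ARC, RIGHT-ARC)
SEQ_LEN = 5     # fixed number of transitions per toy "sentence"
FEAT_DIM = 16

# Frozen representation: in the paper this role is played by the pretrained
# network's activations; here, a random projection over (step, action) pairs.
PROJ = rng.normal(size=(SEQ_LEN * N_ACTIONS, FEAT_DIM))

def phi(step, action):
    """Fixed feature vector for taking `action` at time `step` (never updated)."""
    return PROJ[step * N_ACTIONS + action]

def beam_search(w, beam_size=4, gold=None):
    """Beam-search decode under weights `w`.

    If `gold` is given (training mode), stop as soon as the gold action prefix
    falls out of the beam ("early update", Collins & Roark 2004). Returns the
    best, possibly partial, action sequence on the beam.
    """
    beam = [(0.0, [])]  # (score, action history)
    for step in range(SEQ_LEN):
        cands = [(score + w @ phi(step, a), hist + [a])
                 for score, hist in beam for a in range(N_ACTIONS)]
        cands.sort(key=lambda c: c[0], reverse=True)
        beam = cands[:beam_size]
        if gold is not None and gold[:step + 1] not in [h for _, h in beam]:
            break  # early update: the gold sequence is no longer reachable
    return beam[0][1]

def train(data, epochs=20, beam_size=4):
    """Structured perceptron over beam-search decodings; `phi` stays fixed."""
    w = np.zeros(FEAT_DIM)
    for _ in range(epochs):
        for gold in data:
            pred = beam_search(w, beam_size, gold=gold)
            if pred != gold[:len(pred)]:
                # Promote gold features, demote predicted features.
                for step, (g, p) in enumerate(zip(gold, pred)):
                    w += phi(step, g) - phi(step, p)
    return w

if __name__ == "__main__":
    gold = [0, 2, 1, 1, 0]   # one toy gold transition sequence
    w = train([gold])
    print("decoded:", beam_search(w), "gold:", gold)
```

In a real transition-based parser, `phi` would be evaluated on the parser state (stack, buffer, partial arcs) using the frozen hidden activations of the pretrained network, and the action space would be the full transition inventory; the perceptron update itself is unchanged.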
Year | DOI | Venue
---|---|---|
2015 | 10.3115/v1/p15-1032 | Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing, Vol. 1
Field | DocType | Volume
---|---|---|
Computer science, Dependency grammar, Speech recognition, Artificial intelligence, Treebank, Natural language processing, Decoding methods, Parsing, Artificial neural network, Perceptron | Journal | abs/1506.06158
Citations | PageRank | References
---|---|---|
82 | 2.36 | 23
Authors |
---|
4 |
Name | Order | Citations | PageRank |
---|---|---|---|
David J. Weiss | 1 | 446 | 19.11 |
Chris Alberti | 2 | 227 | 9.86 |
Michael Collins | 3 | 6788 | 785.35 |
Slav Petrov | 4 | 2405 | 107.56 |