Title
Addressing the Data Sparsity Issue in Neural AMR Parsing
Abstract
Neural attention models have achieved great success in different NLP tasks. However, they have not fulfilled their promise on the AMR parsing task due to the data sparsity issue. In this paper, we describe a sequence-to-sequence model for AMR parsing and present different ways to tackle the data sparsity problem. We show that our methods achieve significant improvement over a baseline neural attention model and our results are also competitive against state-of-the-art systems that do not use extra linguistic resources.
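The abstract names a sequence-to-sequence attention model but does not spell out its architecture. Below is a minimal sketch of that family of model, assuming a bidirectional-LSTM encoder and a global-attention LSTM decoder over a linearized AMR target; the class name, hyperparameters, and layer choices are illustrative assumptions, not the authors' implementation.

# Minimal sketch of a sequence-to-sequence model with attention, of the
# kind the abstract describes for AMR parsing. Architecture details are
# assumptions for illustration, not the paper's exact model.
import torch
import torch.nn as nn

class Seq2SeqAttention(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, emb_dim=128, hid_dim=256):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb_dim)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hid_dim,
                               batch_first=True, bidirectional=True)
        self.decoder = nn.LSTMCell(emb_dim + 2 * hid_dim, 2 * hid_dim)
        self.attn = nn.Linear(2 * hid_dim, 2 * hid_dim)  # bilinear score
        self.out = nn.Linear(4 * hid_dim, tgt_vocab)

    def forward(self, src, tgt):
        # Encode the source sentence into contextual states: (B, S, 2H).
        enc_out, _ = self.encoder(self.src_emb(src))
        B, _, H2 = enc_out.shape
        h = enc_out.new_zeros(B, H2)
        c = enc_out.new_zeros(B, H2)
        ctx = enc_out.new_zeros(B, H2)
        logits = []
        for t in range(tgt.size(1)):
            # Feed the current target embedding with the previous context.
            x = torch.cat([self.tgt_emb(tgt[:, t]), ctx], dim=-1)
            h, c = self.decoder(x, (h, c))
            # Attention: score each encoder state against the decoder state.
            scores = torch.bmm(enc_out, self.attn(h).unsqueeze(2)).squeeze(2)
            weights = torch.softmax(scores, dim=-1)            # (B, S)
            ctx = torch.bmm(weights.unsqueeze(1), enc_out).squeeze(1)
            logits.append(self.out(torch.cat([h, ctx], dim=-1)))
        return torch.stack(logits, dim=1)                      # (B, T, V)

In use, src and tgt would be batches of integer token ids, and training would minimize cross-entropy between the returned logits and the gold linearized-AMR token sequence.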
Year
2017
DOI
10.18653/v1/e17-1035
Venue
Conference of the European Chapter of the Association for Computational Linguistics (EACL)
Field
Computer science, Natural language processing, Artificial intelligence, Parsing, Machine learning
DocType
Journal
Volume
abs/1702.05053
Citations
8
PageRank
0.52
References
20
Authors
4
Name            Order  Citations  PageRank
Xiaochang Peng  1      54         5.31
Chuan Wang      2      8          0.52
Daniel Gildea   3      2269       193.43
Nianwen Xue     4      1654       117.65