Abstract |
---|
Dependency-based representations of natural language syntax require a fine balance between structural flexibility and computational complexity. In previous work, several constraints have been proposed to identify classes of dependency structures that are well-balanced in this sense; the best-known but also most restrictive of these is projectivity. Most constraints are formulated on fully specified structures, which makes them hard to integrate into models where structures are composed from lexical information. In this paper, we show how two empirically relevant relaxations of projectivity can be lexicalized, and how combining the resulting lexicons with a regular means of syntactic composition gives rise to a hierarchy of mildly context-sensitive dependency languages. |
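The abstract's central constraint, projectivity, requires that the projection of every word (the word together with all of its transitive dependents) cover a contiguous interval of the sentence. A minimal sketch of such a check is below; it is an illustration of the standard definition, not code from the paper, and it assumes a hypothetical encoding in which `heads[i]` gives the 1-based head index of token `i + 1`, with `0` marking the root.

```python
def is_projective(heads):
    """Check projectivity of a dependency tree.

    heads[i] is the head of token i + 1 (1-based); 0 denotes the root.
    The tree is projective iff every node's projection (the node plus
    all its transitive dependents) is a contiguous interval of positions.
    """
    n = len(heads)
    children = {i: [] for i in range(n + 1)}
    for dep, head in enumerate(heads, start=1):
        children[head].append(dep)

    def projection(node):
        # Positions covered by node and all its transitive dependents.
        span = {node}
        for child in children[node]:
            span |= projection(child)
        return span

    for node in range(1, n + 1):
        span = projection(node)
        # Contiguous iff the interval [min, max] has no gaps.
        if max(span) - min(span) + 1 != len(span):
            return False
    return True
```

For example, `heads = [2, 0, 2]` (both outer words depend on the middle root) is projective, while `heads = [3, 4, 0, 3]` contains crossing arcs (1←3 and 2←4) and is not.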
Year | Venue | Field
---|---|---
2007 | ACL | Computer science, Natural language, Artificial intelligence, Natural language processing, Hierarchy, Syntax, Computational complexity theory
DocType | Volume | Citations
---|---|---
Conference | P07-1 | 14

PageRank | References | Authors
---|---|---
0.87 | 12 | 2
Name | Order | Citations | PageRank
---|---|---|---
Marco Kuhlmann | 1 | 309 | 23.06
Mathias Möhl | 2 | 118 | 8.08