| Abstract |
|---|
| There has been recent interest in applying cognitively or empirically motivated bounds on recursion depth to limit the search space of grammar induction models (Ponvert et al., 2011; Noji and Johnson, 2016; Shain et al., 2016). This work extends this depth-bounding approach to probabilistic context-free grammar induction (DB-PCFG), which has a smaller parameter space than hierarchical sequence models, and therefore more fully exploits the space reductions of depth-bounding. Results for this model on grammar acquisition from transcribed child-directed speech and newswire text exceed or are competitive with those of other models when evaluated on parse accuracy. Moreover, grammars acquired from this model demonstrate a consistent use of category labels, something which has not been demonstrated by other acquisition models. |
| Year | DOI | Venue | DocType | Volume | Citations | PageRank | References | Authors |
|---|---|---|---|---|---|---|---|---|
| 2018 | 10.1162/tacl_a_00016 | Transactions of the Association for Computational Linguistics | Journal | abs/1802.08545 | 0 | 0.34 | 4 | 5 |
Name | Order | Citations | PageRank |
---|---|---|---|
Lifeng Jin | 1 | 0 | 1.69 |
Finale Doshi-Velez | 2 | 574 | 51.99 |
Timothy A. Miller | 3 | 71 | 13.76 |
William Schuler | 4 | 19 | 2.39 |
Lane Schwartz | 5 | 209 | 18.01 |