Abstract |
---|
While most work on parsing with PCFGs has focused on local correlations between tree configurations, we attempt to model non-local correlations using a finite mixture of PCFGs. A mixture grammar fit with the EM algorithm shows improvement over a single PCFG, both in parsing accuracy and in test data likelihood. We argue that this improvement comes from the learning of specialized grammars that capture non-local correlations. |
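The abstract's core idea — a finite mixture of PCFGs fit with the EM algorithm — can be sketched in miniature. The snippet below is an illustrative sketch, not the paper's implementation: it re-estimates only the mixture weights, assuming each sentence's log-likelihood under each component grammar has already been computed (e.g. by the inside algorithm); the paper's full EM would also re-estimate each component grammar's rule probabilities in the M-step.

```python
import math

def em_mixture_weights(loglikes, n_iters=50):
    """Fit the weights of a finite mixture of PCFGs with EM.

    loglikes[i][k] is the log-likelihood of sentence i under
    component grammar k (assumed precomputed). Returns the
    mixture weights after n_iters EM iterations.
    """
    n, K = len(loglikes), len(loglikes[0])
    weights = [1.0 / K] * K  # start from a uniform mixture
    for _ in range(n_iters):
        # E-step: posterior responsibility of component k for sentence i,
        # computed in log space for numerical stability.
        resp_totals = [0.0] * K
        for row in loglikes:
            scores = [math.log(weights[k]) + row[k] for k in range(K)]
            m = max(scores)
            probs = [math.exp(s - m) for s in scores]
            z = sum(probs)
            for k in range(K):
                resp_totals[k] += probs[k] / z
        # M-step: new weight of each component is its average responsibility.
        weights = [t / n for t in resp_totals]
    return weights
```

With sentences that strongly prefer one component, EM concentrates the weights accordingly, which is the mechanism by which specialized component grammars can emerge.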
Year | Venue | Keywords
---|---|---|
2006 | CoNLL | single pcfg, specialized grammar, parsing accuracy, finite mixture, local correlation, non-local modeling, test data likelihood, mixture grammar fit, capture non-local correlation, non-local correlation, em algorithm
Field | DocType | Citations
---|---|---|
Rule-based machine translation, Expectation–maximization algorithm, Computer science, Grammar, Speech recognition, Artificial intelligence, Test data, Natural language processing, Parsing, Finite mixture, Machine learning | Conference | 0
PageRank | References | Authors
---|---|---|
0.34 | 11 | 3
Name | Order | Citations | PageRank |
---|---|---|---|
Slav Petrov | 1 | 2405 | 107.56 |
Leon Barrett | 2 | 520 | 25.02 |
Dan Klein | 3 | 8083 | 495.21 |