Abstract |
---|
This paper shows that discriminative reranking with an averaged perceptron model yields substantial improvements in realization quality with CCG. The paper confirms the utility of including language model log probabilities as features in the model, which prior work on discriminative training with log linear models for HPSG realization had called into question. The perceptron model allows the combination of multiple n-gram models to be optimized and then augmented with both syntactic features and discriminative n-gram features. The full model yields a state-of-the-art BLEU score of 0.8506 on Section 23 of the CCGbank, to our knowledge the best score reported to date using a reversible, corpus-engineered grammar. |
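The abstract describes reranking n-best realizations with an averaged perceptron whose features combine n-gram language model log probabilities with syntactic features. A minimal sketch of that training loop is below; it is illustrative only, not the paper's implementation, and the feature names (`"lm"`) and data layout are assumptions for the example.

```python
# Illustrative averaged-perceptron reranker over n-best candidate lists.
# Each training instance is a list of (feature_dict, is_gold) candidates;
# feature names here are hypothetical, not taken from the paper.

def score(weights, feats):
    """Linear score: dot product of weights and candidate features."""
    return sum(weights.get(f, 0.0) * v for f, v in feats.items())

def train_averaged_perceptron(nbest_lists, epochs=5):
    """nbest_lists: list of candidate lists, each [(features, is_gold), ...]."""
    weights, totals, step = {}, {}, 1
    for _ in range(epochs):
        for candidates in nbest_lists:
            # Model's current top-ranked candidate vs. the gold realization.
            best = max(candidates, key=lambda c: score(weights, c[0]))
            gold = next(c for c in candidates if c[1])
            if not best[1]:  # top pick is wrong: standard perceptron update
                for f, v in gold[0].items():
                    weights[f] = weights.get(f, 0.0) + v
                    totals[f] = totals.get(f, 0.0) + step * v
                for f, v in best[0].items():
                    weights[f] = weights.get(f, 0.0) - v
                    totals[f] = totals.get(f, 0.0) - step * v
            step += 1
    # Averaging trick: w_avg = w - totals / step, computed in one pass.
    return {f: weights[f] - totals.get(f, 0.0) / step for f in weights}
```

Averaging the weight vector over all updates is the standard stabilization for perceptron reranking; here a single language-model log-probability feature would receive a positive weight, so less improbable realizations are ranked higher.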
Year | Venue | Keywords
---|---|---
2009 | EMNLP | hpsg realization, perceptron model, multiple n-gram model, perceptron reranking, discriminative reranking, ccg realization, perceptron model yield, discriminative training, discriminative n-gram feature, log linear model, language model log probability, full model yield, language model
Field | DocType | Volume
---|---|---
Head-driven phrase structure grammar, Computer science, Grammar, Speech recognition, Artificial intelligence, Natural language processing, Log-linear model, Discriminative model, Syntax, Perceptron, Machine learning, Language model | Conference | D09-1
Citations | PageRank | References
---|---|---
24 | 1.02 | 28
Authors |
---|
2 |
Name | Order | Citations | PageRank |
---|---|---|---
Michael White | 1 | 101 | 7.24 |
Rajakrishnan Rajkumar | 2 | 94 | 6.72 |