Abstract |
---|
Widely used in speech and language processing, Kneser-Ney (KN) smoothing has consistently been shown to be one of the best-performing smoothing methods. However, KN smoothing assumes integer counts, limiting its potential uses; for example, it cannot be applied inside Expectation-Maximization. In this paper, we propose a generalization of KN smoothing that operates on fractional counts or, more precisely, on distributions over counts. We rederive all the steps of KN smoothing to operate on count distributions instead of integral counts, and apply it to two tasks where KN smoothing was not applicable before: one in language model adaptation, and the other in word alignment. In both cases, our method improves performance significantly. |

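The integer-count assumption the abstract refers to can be seen in standard interpolated KN smoothing, where an absolute discount is subtracted from each observed count. The following is a minimal illustrative sketch of the classic integer-count bigram case (not the paper's expected-count generalization); all function and variable names are our own, and the discount value is an arbitrary example.

```python
from collections import Counter

def kn_bigram_prob(bigrams, d=0.75):
    """Interpolated Kneser-Ney bigram model from integer counts.

    bigrams: list of (u, w) pairs observed in training data.
    d: absolute discount subtracted from each nonzero bigram count.
    Returns a function p(w, u) giving P(w | u).
    """
    c_bigram = Counter(bigrams)                       # c(u, w)
    c_left = Counter(u for u, _ in bigrams)           # c(u, *)
    uniq = set(bigrams)                               # distinct bigram types
    cont = Counter(w for _, w in uniq)                # N_{1+}(* w): distinct left contexts of w
    total_types = len(uniq)                           # N_{1+}(* *)
    n_follow = Counter(u for u, _ in uniq)            # N_{1+}(u *): distinct continuations of u

    def p(w, u):
        p_cont = cont[w] / total_types                # lower-order continuation probability
        if c_left[u] == 0:
            return p_cont                             # unseen context: back off entirely
        disc = max(c_bigram[(u, w)] - d, 0) / c_left[u]
        lam = d * n_follow[u] / c_left[u]             # mass freed by discounting
        return disc + lam * p_cont

    return p
```

The `max(c - d, 0)` step is where integer counts are assumed; the paper's contribution is to take this expectation under a distribution over counts instead.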
Year | Venue | Field |
---|---|---|
2014 | PROCEEDINGS OF THE 52ND ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 1 | Integer, Computer science, Kneser–Ney smoothing, Algorithm, Smoothing, Artificial intelligence, Language model, Limiting, Machine learning |
DocType | Volume | Citations |
---|---|---|
Conference | P14-1 | 9 |

PageRank | References | Authors |
---|---|---|
0.48 | 17 | 2 |

Name | Order | Citations | PageRank |
---|---|---|---|
Hui Zhang | 1 | 10 | 0.83 |
David Chiang | 2 | 2843 | 144.76 |