Abstract |
---|
In this paper, we propose a new language model, a dependency structure language model, for topic detection and tracking (TDT) to compensate for the weaknesses of unigram and bigram language models. The dependency structure language model is based on the Chow expansion theory and on dependency parse trees generated by a linguistic parser, so long-distance dependencies are captured naturally. We carried out extensive experiments to verify the proposed model on topic tracking and link detection in TDT. In both tasks, the dependency structure language model outperforms strong baseline approaches. |
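The core idea in the abstract can be sketched as follows: under a Chow-expansion-style model, the joint probability of the words in a document factors along the edges of a dependency tree, i.e. P(words) = P(root) · Π P(dependent | head), so a head and its dependent contribute jointly even when they are far apart in the surface string. The probability tables and the example sentence below are illustrative assumptions, not figures from the paper.

```python
import math

# Toy unigram probabilities (assumed values, for illustration only).
unigram = {"stocks": 0.02, "fell": 0.01, "sharply": 0.005}

# Toy conditional probabilities P(dependent | head) along
# dependency-tree edges (assumed values, for illustration only).
dep_prob = {("fell", "stocks"): 0.3, ("fell", "sharply"): 0.2}

def score(root, edges):
    """Log-probability under a Chow-expansion-style dependency model:
    log P(words) = log P(root) + sum over tree edges of
    log P(dependent | head)."""
    logp = math.log(unigram[root])
    for head, dependent in edges:
        logp += math.log(dep_prob[(head, dependent)])
    return logp

# "stocks fell sharply": "fell" is the root; "stocks" and "sharply"
# both attach to it, regardless of their surface distance.
lp = score("fell", [("fell", "stocks"), ("fell", "sharply")])
```

A unigram model would score the same sentence as log P("stocks") + log P("fell") + log P("sharply"), ignoring the head-dependent relations that the tree factorization exploits.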
Year | DOI | Venue |
---|---|---|
2007 | 10.1016/j.ipm.2006.02.007 | Inf. Process. Manage. |
Keywords | Field | DocType
---|---|---|
long-distance dependency,term dependence,link detection,topic tracking,new language model,dependency parse tree,topic detection and tracking,chow expansion theory,dependency structure language model,topic detection,bigram language model,dependency parsing,language model | Parse tree,Computer science,Computational linguistics,Speech recognition,Information extraction,Statistical model,Bigram,Natural language processing,Artificial intelligence,Parsing,Constructed language,Language model | Journal
Volume | Issue | Journal
---|---|---|
43 | 5 | Information Processing and Management
Citations | PageRank | References
---|---|---|
13 | 0.69 | 13
Authors |
---|
3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Changki Lee | 1 | 279 | 26.18 |
Gary Geunbae Lee | 2 | 932 | 93.23 |
Myung-gil Jang | 3 | 173 | 17.43 |