Title: Combining Independent Modules in Lexical Multiple-Choice Problems
Abstract: Existing statistical approaches to natural language problems are very coarse approximations to the true complexity of language processing. As such, no single technique will be best for all problem instances. Many researchers are examining ensemble methods that combine the output of multiple modules to create more accurate solutions. This paper examines three merging rules for combining probability distributions: the familiar mixture rule, the logarithmic rule, and a novel product rule. These rules were applied with state-of-the-art results to two problems used to assess human mastery of lexical semantics: synonym questions and analogy questions. All three merging rules result in ensembles that are more accurate than any of their component modules. The differences among the three rules are not statistically significant, but it is suggestive that the popular mixture rule is not the best rule for either of the two problems.
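For reference, below is a minimal sketch of the three merging rules named in the abstract, under assumed definitions: the mixture rule as a weighted arithmetic mean of module distributions, the logarithmic rule as a renormalized weighted geometric mean, and the product rule as a product of distributions each first smoothed toward uniform by its module weight (that smoothing step is an assumption on my part, not stated in the abstract). The module distributions and weights are hypothetical.

import numpy as np

def mixture_rule(dists, weights):
    """Mixture rule: weighted arithmetic mean of the module distributions."""
    dists, weights = np.asarray(dists, float), np.asarray(weights, float)
    merged = weights @ dists                        # sum_i w_i * p_i(choice)
    return merged / merged.sum()

def logarithmic_rule(dists, weights, eps=1e-12):
    """Logarithmic rule: renormalized weighted geometric mean, prod_i p_i ** w_i."""
    dists, weights = np.asarray(dists, float), np.asarray(weights, float)
    log_merged = weights @ np.log(dists + eps)      # sum_i w_i * log p_i(choice)
    merged = np.exp(log_merged - log_merged.max())  # subtract max for numerical stability
    return merged / merged.sum()

def product_rule(dists, weights):
    """Product rule (assumed variant): multiply the module distributions after
    smoothing each one toward the uniform distribution by its weight."""
    dists, weights = np.asarray(dists, float), np.asarray(weights, float)
    n_choices = dists.shape[1]
    smoothed = weights[:, None] * dists + (1.0 - weights[:, None]) / n_choices
    merged = smoothed.prod(axis=0)
    return merged / merged.sum()

# Toy example: two hypothetical modules scoring a four-way multiple-choice question.
p1 = [0.50, 0.20, 0.20, 0.10]   # module 1's distribution over the four choices
p2 = [0.40, 0.40, 0.10, 0.10]   # module 2's distribution over the four choices
w = [0.6, 0.4]                  # hypothetical module weights
for rule in (mixture_rule, logarithmic_rule, product_rule):
    print(rule.__name__, np.round(rule([p1, p2], w), 3))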
Year: 2003
Venue: RECENT ADVANCES IN NATURAL LANGUAGE PROCESSING III
Keywords: language, lexical semantics, computational linguistics, multiple choice, statistical models, natural language, semantics, information retrieval, machine learning, statistical significance, probability distribution
DocType: Conference
Volume: 260
ISSN: 0304-0763
Citations: 8
PageRank: 0.50
References: 15
Authors (4):

Name                  Order  Citations  PageRank
Peter D. Turney       1      6084       534.36
Michael L. Littman    2      9798       961.84
Jeffrey P. Bigham     3      2647       189.29
Victor Shnayder       4      1393       117.15