Title
A comparison of models of word meaning in context
Abstract
This paper compares a number of recently proposed models for computing context-sensitive word similarity. We clarify the connections between these models, simplify their formulation, and evaluate them in a unified setting. We show that the models are essentially equivalent if syntactic information is ignored, and that the substantial performance differences previously reported disappear to a large extent when these simplified variants are evaluated under identical conditions. Furthermore, our reformulation allows for the design of a straightforward and fast implementation.
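To illustrate what "context-sensitive word similarity" refers to, the following is a minimal sketch, not the paper's actual models: a target word's distributional vector is contextualized with the vector of a context word (here by componentwise multiplication, a simple syntax-free composition chosen for illustration), and contextualized vectors are compared by cosine similarity. All vectors and words below are toy assumptions.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def contextualize(target_vec, context_vec):
    """Contextualized vector: componentwise product of target and context vectors.
    This is one simple, syntax-free composition, used here only as an assumption."""
    return target_vec * context_vec

# Toy distributional vectors over an arbitrary 4-dimensional feature space.
charge = np.array([0.9, 0.1, 0.8, 0.2])   # target word "charge"
battery = np.array([0.8, 0.0, 0.9, 0.1])  # context word in "charge the battery"
suspect = np.array([0.1, 0.9, 0.2, 0.8])  # context word in "charge the suspect"
accuse = np.array([0.2, 0.8, 0.1, 0.9])   # candidate paraphrase "accuse"

# The same target word receives different similarities to "accuse"
# depending on the context it is composed with.
print(cosine(contextualize(charge, battery), accuse))  # low: "charge" as in charging a battery
print(cosine(contextualize(charge, suspect), accuse))  # high: "charge" as in accusing someone
```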
Year
2012
Venue
HLT-NAACL
Keywords
word meaning, syntactic information, substantial performance difference, fast implementation, large extent, unified setting, context sensitive word similarity, identical condition
Field
Computer science, Natural language processing, Artificial intelligence, Syntax
DocType
Conference
Citations
8
PageRank
0.56
References
11
Authors
3
Name | Order | Citations | PageRank
Georgiana Dinu | 1 | 510 | 33.36
Stefan Thater | 2 | 756 | 38.54
Sören Laue | 3 | 114 | 11.79