Abstract

A major hurdle faced by many information retrieval researchers---especially in academia---is evaluating retrieval systems in the wild. Challenges include tapping into large user bases, collecting user behavior, and modifying a given retrieval system. We outline several options available to researchers to overcome these challenges, along with their advantages and disadvantages. We then demonstrate how CrowdLogger, an open-source browser extension for Firefox and Google Chrome, can be used as an in situ evaluation platform.
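
The abstract describes collecting user behavior in situ through a browser extension. As a rough illustration of that idea (a sketch only, not CrowdLogger's actual API), the following TypeScript snippet shows how an extension might buffer interaction events and upload them in batches to a researcher's server; the event fields, batch size, and endpoint are assumptions.

```typescript
// Hypothetical sketch of in situ behavior logging from a browser extension
// (illustrative only; not the actual CrowdLogger API). The event schema,
// batch size, and upload endpoint below are assumptions.

interface InteractionEvent {
  type: "query" | "click" | "pageFocus";      // kind of user action observed
  timestamp: number;                           // milliseconds since epoch
  detail: Record<string, string>;              // e.g. { query: "..." } or { url: "..." }
}

const buffer: InteractionEvent[] = [];
const FLUSH_THRESHOLD = 50;                    // assumed batch size
const UPLOAD_URL = "https://example.org/log";  // placeholder collection endpoint

// Record one event; flush the batch once enough events have accumulated.
function logEvent(event: InteractionEvent): void {
  buffer.push(event);
  if (buffer.length >= FLUSH_THRESHOLD) {
    void flush();
  }
}

// Send all buffered events to the collection server and clear the buffer.
async function flush(): Promise<void> {
  const batch = buffer.splice(0, buffer.length);
  await fetch(UPLOAD_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(batch),
  });
}

// Example: an extension's background script might call this when it observes a search.
logEvent({
  type: "query",
  timestamp: Date.now(),
  detail: { query: "information retrieval evaluation" },
});
```
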
Year | DOI | Venue
---|---|---
2013 | 10.1145/2513150.2513164 | LivingLab@CIKM

Field | DocType | Citations
---|---|---
Data mining, World Wide Web, Information retrieval, Computer science, User studies | Conference | 2

PageRank | References | Authors
---|---|---
0.37 | 13 | 2

Name | Order | Citations | PageRank |
---|---|---|---
Henry A. Feild | 1 | 329 | 18.08 |
James F. Allen | 2 | 9929 | 1631.65 |