Abstract |
---|
Measuring the information retrieval effectiveness of Web search engines can be expensive if human relevance judgments are required to evaluate search results. Using implicit user feedback for search engine evaluation provides a cost- and time-effective way of addressing this problem. Web search engines can thus use human evaluation of search results without the expense of human evaluators. An additional advantage of this approach is the availability of real-time data on system performance. We capture user relevance judgment actions such as print, save, and bookmark, sending these actions and the corresponding document identifiers to a central server via a client application. We use this implicit feedback to calculate performance metrics, such as precision, and can compute an overall system performance metric from a collection of weighted metrics. |
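The abstract's weighted-metric idea can be sketched as follows. This is a minimal illustration, not the paper's implementation: the specific actions, weights, and function names are assumptions.

```python
# Sketch: derive per-action precision from implicit feedback and
# combine the scores into one overall system performance value.
# Action names and weight values below are illustrative assumptions.

def precision(relevant_retrieved, retrieved):
    """Fraction of retrieved documents the user implicitly judged relevant."""
    return relevant_retrieved / retrieved if retrieved else 0.0

def overall_performance(metrics, weights):
    """Weighted combination of per-metric scores (weights assumed to sum to 1)."""
    return sum(weights[name] * score for name, score in metrics.items())

# Example: precision inferred from three implicit user actions
# on a result list of 10 documents.
metrics = {
    "print": precision(3, 10),     # 3 of 10 results printed
    "save": precision(4, 10),      # 4 of 10 results saved
    "bookmark": precision(2, 10),  # 2 of 10 results bookmarked
}
weights = {"print": 0.4, "save": 0.4, "bookmark": 0.2}
print(overall_performance(metrics, weights))  # → 0.32
```

In practice the client application would report each action together with the document identifier to the central server, which maintains these counts per query and updates the weighted metric in real time.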
Year | DOI | Venue |
---|---|---|
2005 | 10.1145/1076034.1076172 | SIGIR |
Keywords | Field | DocType |
---|---|---|
automated evaluation, search engine evaluation, user relevance judgment, search engine performance, implicit user feedback, system performance, web search engine, human evaluation, human evaluator, performance metrics, overall system performance, human relevance judgment, search result, real time data, information retrieval, search engine | Data mining, Metasearch engine, Search engine, Information retrieval, Real-time data, Identifier, Computer science, Performance metric, Search engine indexing, Search analytics | Conference |
ISBN | Citations | PageRank |
---|---|---|
1-59593-034-5 | 14 | 0.64 |
References | Authors |
---|---|
4 | 2 |
Name | Order | Citations | PageRank |
---|---|---|---|
Himanshu Sharma | 1 | 20 | 2.58 |
Bernard J. Jansen | 2 | 4753 | 394.06 |