Abstract | ||
---|---|---|
Within the area of Information Retrieval (IR), the importance of appropriate ranking of results has increased markedly. This importance is magnified in systems dedicated to XML retrieval, since users of these systems expect the retrieval of highly relevant and highly precise components rather than entire documents. As an international, coordinated effort to evaluate the performance of Information Retrieval systems, the Initiative for the Evaluation of XML Retrieval (INEX) encourages participating organisations to run queries on their search engines and to submit their results to the annual INEX workshop. In previous INEX workshops the submitted results were manually assessed by participants, and the search engines were ranked in terms of performance. This paper presents a Collective Ranking Strategy that outperforms all of the search engines it is based on. Moreover, it provides a system that facilitates the ranking of participating search engines. |
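The abstract does not spell out how the Collective Ranking Strategy combines the engines' result lists; one common family of fusion techniques it resembles is positional rank fusion. The sketch below is a hypothetical illustration using a simple Borda count (the document names `d1`–`d4` and the engine lists purely for the example; none of these come from the paper):

```python
from collections import defaultdict

def collective_rank(rankings):
    """Fuse several engines' ranked lists with a simple Borda count:
    each document earns (list_length - position) points per list,
    and documents are returned in order of total points."""
    scores = defaultdict(int)
    for ranked in rankings:
        n = len(ranked)
        for pos, doc in enumerate(ranked):
            scores[doc] += n - pos
    # Sort by total score (highest first), breaking ties alphabetically.
    return sorted(scores, key=lambda d: (-scores[d], d))

# Three hypothetical engines' top-3 result lists for one query.
engine_a = ["d1", "d2", "d3"]
engine_b = ["d2", "d1", "d4"]
engine_c = ["d2", "d3", "d1"]
fused = collective_rank([engine_a, engine_b, engine_c])
# d2 is ranked first by two of the three engines, so it tops the fused list.
```

A fused list like this can beat any single input list because documents judged relevant by several engines are promoted, while each engine's idiosyncratic errors are diluted.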
Year | DOI | Venue |
---|---|---|
2004 | 10.1007/11424550_10 | INEX |
Keywords | Field | DocType
---|---|---|
entire document,appropriate ranking,information retrieval system,collective ranking strategy,xml retrieval,annual inex workshop,information retrieval,search engine,xml document,previous inex,precise component | World Wide Web,Human–computer information retrieval,Okapi BM25,Information retrieval,Query expansion,Computer science,Ranking (information retrieval),Relevance (information retrieval),Document retrieval,Adversarial information retrieval,Concept search | Conference
Volume | ISSN | ISBN
---|---|---|
3493 | 0302-9743 | 3-540-26166-4
Citations | PageRank | References
---|---|---|
1 | 0.35 | 6
Authors | ||
---|---|---|
3 | ||
Name | Order | Citations | PageRank |
---|---|---|---|
Maha Salem | 1 | 217 | 13.30 |
Alan Woodley | 2 | 10 | 6.06 |
Shlomo Geva | 3 | 658 | 90.59 |