Abstract |
---|
Research in Information Retrieval shows performance improvement when many sources of evidence are combined to produce a ranking of documents. Most current approaches assess document relevance by computing a single score which aggregates values of some attributes or criteria. We propose a multiple criteria framework using an aggregation mechanism based on decision rules identifying positive and negative reasons for judging whether a document should get a better ranking than another. The resulting procedure also handles imprecision in criteria design. Experimental results are reported. |
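The abstract's aggregation mechanism, pairwise decision rules counting positive and negative reasons for preferring one document over another, can be illustrated with a minimal sketch. All function names, thresholds, and rule parameters below are illustrative assumptions, not the paper's exact model:

```python
# Hypothetical sketch of a pairwise outranking test in the spirit of the
# abstract: per-criterion scores, an indifference threshold, and decision
# rules counting positive/negative reasons. Parameter values are assumed.

def outranks(scores_a, scores_b, indifference=0.05,
             min_positive=2, max_negative=0):
    """Return True if document a should be ranked above document b.

    scores_a, scores_b: per-criterion relevance scores (same length).
    indifference: score differences below this are treated as ties,
        which is one way to handle imprecision in criteria design.
    """
    # Positive reasons: criteria where a is clearly better than b.
    positive = sum(1 for sa, sb in zip(scores_a, scores_b)
                   if sa - sb > indifference)
    # Negative reasons: criteria where b is clearly better than a.
    negative = sum(1 for sa, sb in zip(scores_a, scores_b)
                   if sb - sa > indifference)
    # Decision rule: enough support for a, and no strong opposition.
    return positive >= min_positive and negative <= max_negative

# Example with three hypothetical criteria (e.g. text score,
# authority score, term proximity):
a = [0.80, 0.60, 0.70]
b = [0.50, 0.55, 0.40]
print(outranks(a, b))  # prints True: a wins on two criteria, ties on one
```

A ranking could then be derived from the pairwise outranking relation, e.g. by ordering documents by how many others they outrank; the paper's actual exploitation procedure may differ.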
Year | DOI | Venue
---|---|---
2006 | 10.1007/11880561_20 | SPIRE
Keywords | Field | DocType
---|---|---
aggregates value, decision rule, multiple criteria framework, information retrieval, criteria design, aggregation mechanism, document relevance, better ranking, multiple criterion, current approach, relevance | Decision rule, Data mining, Multiple criteria, Ranking, Information retrieval, Computer science, Expert system, Ranking (information retrieval), Knowledge base, String (computer science), Performance improvement | Conference

Volume | ISSN | ISBN
---|---|---
4209 | 0302-9743 | 3-540-45774-7

Citations | PageRank | References
---|---|---
4 | 0.47 | 15
Authors |
---|
2 |
Name | Order | Citations | PageRank
---|---|---|---
Mohamed Farah | 1 | 12 | 1.31
Daniel Vanderpooten | 2 | 1153 | 74.66