Title
Smoothing NDCG metrics using tied scores
Abstract
One of the promising directions in learning-to-rank research concerns the choice of an appropriate objective function to maximize by means of machine learning algorithms. We describe a novel technique for smoothing an arbitrary ranking metric and demonstrate how to use it to maximize retrieval quality in terms of the $NDCG$ metric. The idea behind our listwise ranking model, called TieRank, is the artificial probabilistic tying of predicted relevance scores at each iteration of the learning process, which defines a distribution over the set of all permutations of the retrieved documents. This distribution yields the desired smoothed version of the target retrieval quality metric, which can then be maximized by gradient descent. Experiments on the LETOR collections show that TieRank outperforms most existing learning-to-rank algorithms.
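The abstract's core idea is that plain NDCG is a step function of the predicted scores (an infinitesimal score change either swaps a pair of documents or changes nothing), so it has no useful gradient. Randomizing the ranking near ties makes the *expected* NDCG smooth. The sketch below illustrates this with Gaussian score perturbation and Monte-Carlo averaging; this is a generic smoothing device for illustration, not TieRank's specific tying distribution (the function names and the choice of perturbation are assumptions, not from the paper):

```python
import math
import numpy as np

def dcg(relevances):
    # Discounted cumulative gain with the standard (2^rel - 1) / log2(rank + 1) gain.
    return sum((2 ** r - 1) / math.log2(i + 2) for i, r in enumerate(relevances))

def ndcg(scores, relevances):
    # Rank documents by descending score, then normalize DCG by the ideal DCG.
    order = np.argsort(-np.asarray(scores, dtype=float))
    ideal = dcg(sorted(relevances, reverse=True))
    return dcg([relevances[i] for i in order]) / ideal if ideal > 0 else 0.0

def smoothed_ndcg(scores, relevances, sigma=1.0, n_samples=2000, seed=0):
    # Monte-Carlo estimate of E[NDCG] when scores are perturbed by Gaussian noise.
    # Near-tied documents then swap ranks with nonzero probability, so the
    # expectation varies smoothly with the scores (illustrative stand-in for
    # the probabilistic tying of scores described in the abstract).
    rng = np.random.default_rng(seed)
    s = np.asarray(scores, dtype=float)
    total = 0.0
    for _ in range(n_samples):
        total += ndcg(s + sigma * rng.standard_normal(len(s)), relevances)
    return total / n_samples
```

As `sigma` shrinks, the smoothed value approaches the hard NDCG of the deterministic ranking; larger `sigma` trades fidelity for a better-behaved objective. In practice one would differentiate a closed-form expectation rather than a sampled one, but the sampled version makes the smoothing effect easy to see.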
Year
2011
DOI
10.1145/2063576.2063888
Venue
CIKM
Keywords
novel technique, artificial probabilistic, smooth function, target retrieval quality metric, smoothing ndcg metrics, letor collection, retrieval quality, gradient descent method, listwise ranking model, appropriate choice, objective function, learning to rank, information retrieval, machine learning
Field
Online machine learning, Data mining, Learning to rank, Gradient descent, Ranking SVM, Ranking, Computer science, Permutation, Smoothing, Artificial intelligence, Probabilistic logic, Machine learning
DocType
Conference
Citations
1
PageRank
0.41
References
5
Authors
6
Name | Order | Citations | PageRank
Andrey Kustarev | 1 | 31 | 2.26
Yury Ustinovskiy | 2 | 29 | 4.59
Yury Logachev | 3 | 4 | 1.50
Evgeny Grechnikov | 4 | 1 | 0.41
Ilya Segalovich | 5 | 143 | 9.67
Pavel Serdyukov | 6 | 1341 | 90.10