Title |
---|
Efficient gradient descent algorithm for sparse models with application in learning-to-rank |
Abstract |
---|
Recently, learning-to-rank has attracted considerable attention. Although significant research effort has been devoted to learning-to-rank, far less attention has been paid to learning sparse models for ranking. In this paper, we consider the sparse learning-to-rank problem. We formulate it as an optimization problem with ℓ1 regularization, and develop a simple but efficient iterative algorithm to solve it. Experimental results on four benchmark datasets demonstrate that the proposed algorithm achieves (1) superior performance compared to several state-of-the-art learning-to-rank algorithms, and (2) very competitive performance compared to FenchelRank, which also learns a sparse model for ranking. |
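The paper's specific ranking algorithm is not reproduced in this record; as a generic, hedged illustration of the kind of ℓ1-regularized iterative gradient descent the abstract describes, the sketch below applies an ISTA-style proximal step (soft-thresholding) to a simple least-squares loss. The loss choice, step size, and data are assumptions for demonstration only, not the authors' method.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of the l1 norm: shrinks each coordinate toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(X, y, lam=0.05, step=None, iters=500):
    """Iterative shrinkage-thresholding for
    min_w 0.5 * ||X w - y||^2 + lam * ||w||_1.

    Generic sketch of l1-regularized gradient descent; the paper's
    ranking-specific loss would replace the least-squares term.
    """
    n, d = X.shape
    if step is None:
        # 1 / Lipschitz constant of the gradient of the smooth part.
        step = 1.0 / np.linalg.norm(X, 2) ** 2
    w = np.zeros(d)
    for _ in range(iters):
        grad = X.T @ (X @ w - y)                          # gradient step
        w = soft_threshold(w - step * grad, step * lam)   # proximal l1 step
    return w

# Toy usage: recover a sparse weight vector from noiseless measurements.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
w_true = np.zeros(20)
w_true[:3] = [2.0, -1.5, 1.0]
y = X @ w_true
w_hat = ista(X, y)
```

The soft-thresholding step is what drives coordinates of `w` exactly to zero, yielding the sparse model; a plain gradient step on the ℓ1 term would not, since the ℓ1 norm is non-differentiable at zero.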
Year | DOI | Venue |
---|---|---|
2013 | 10.1016/j.knosys.2013.06.001 | Knowl.-Based Syst. |
Keywords | Field | DocType
---|---|---
efficient iterative algorithm, sparse model, benchmark datasets, competitive performance, superior performance gain, proposed algorithm, optimization problem, sparse learning-to-rank problem, state-of-the-art learning-to-rank algorithm, efficient gradient descent algorithm, information retrieval | Learning to rank, Gradient descent, Mathematical optimization, Ranking, Sparse model, Iterative method, Computer science, Sparse approximation, Regularization (mathematics), Artificial intelligence, Optimization problem, Machine learning | Journal

Volume | ISSN | Citations
---|---|---
49 | 0950-7051 | 4

PageRank | References | Authors
---|---|---
0.43 | 24 | 4
Name | Order | Citations | PageRank |
---|---|---|---|
Hanjiang Lai | 1 | 234 | 17.67 |
Yan Pan | 2 | 179 | 19.23 |
Yong Tang | 3 | 22 | 2.06 |
Ning Liu | 4 | 47 | 4.82 |