Abstract |
---|
Ordinal regression (OR) is one of the most important machine learning tasks. The kernel method is a major technique for achieving nonlinear OR. However, traditional kernel OR solvers are inefficient due to the increased complexity introduced by multiple ordinal thresholds, as well as the cost of kernel computation. Doubly stochastic gradient (DSG) is a very efficient and scalable kernel learning algorithm that combines random feature approximation with stochastic functional optimization. However, the theory and algorithm of DSG support optimization only within a single reproducing kernel Hilbert space (RKHS), which is not suitable for OR problems, where the multiple ordinal thresholds usually lead to multiple RKHSs. To address this problem, we construct a kernel whose RKHS can contain the decision function with multiple thresholds. Based on this new kernel, we further propose a novel DSG-like algorithm, DSGOR. In each iteration of DSGOR, we update the decision function as well as the function bias, with an appropriately set learning rate for each. Our theoretical analysis shows that DSGOR achieves an O(1/t) convergence rate, which is as good as that of DSG, even though it deals with a much harder problem. Extensive experimental results demonstrate that our algorithm is much more efficient than traditional kernel OR solvers, especially on large-scale problems. |
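For intuition on the DSG family the abstract builds on: such methods pair random Fourier feature approximation of a kernel with per-sample stochastic gradient updates. Below is a minimal sketch of that combination on a toy regression problem — it is not the paper's DSGOR algorithm, and the data, bandwidth, feature count, and constant step size are all illustrative choices (the paper's analysis uses decaying learning rates):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data: y = sin(3x) + noise.
n, d = 200, 1
X = rng.uniform(-1.0, 1.0, size=(n, d))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.normal(size=n)

# Random Fourier features approximating the RBF kernel
# k(x, x') = exp(-||x - x'||^2 / (2 * sigma^2)).
D = 100        # number of random features (illustrative)
sigma = 0.5    # kernel bandwidth (illustrative)
W = rng.normal(scale=1.0 / sigma, size=(d, D))
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

def features(x):
    """Map inputs into the random-feature space."""
    return np.sqrt(2.0 / D) * np.cos(x @ W + b)

# Stochastic gradient descent on squared loss in feature space:
# one randomly drawn sample per step, echoing the "doubly
# stochastic" recipe (random data sample + random features).
w = np.zeros(D)
eta = 0.5      # constant step size, kept simple for this sketch
for t in range(10000):
    i = rng.integers(n)
    phi = features(X[i : i + 1])[0]
    grad = (phi @ w - y[i]) * phi
    w -= eta * grad

pred = features(X) @ w
mse = np.mean((pred - y) ** 2)
```

After training, `mse` should fall well below the raw variance of `y`, showing that the random-feature model has captured the nonlinear target without ever forming the full kernel matrix.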
Field | Value
---|---
Year | 2021
DOI | 10.1109/TNNLS.2020.3015937
Venue | IEEE Transactions on Neural Networks and Learning Systems
Keywords | Doubly stochastic gradients (DSGs), kernel learning, ordinal regression (OR), random features
DocType | Journal
Volume | 32
Issue | 8
ISSN | 2162-237X
Citations | 0
PageRank | 0.34
References | 19
Authors | 7
Name | Order | Citations | PageRank |
---|---|---|---|
Bin Gu | 1 | 648 | 33.45 |
Xiang Geng | 2 | 9 | 2.85 |
Xiang Li | 3 | 52 | 8.31 |
Shi Wanli | 4 | 0 | 1.35 |
Guan-Sheng Zheng | 5 | 9 | 2.94 |
Cheng Deng | 6 | 1283 | 85.48 |
Heng Huang | 7 | 3080 | 203.21 |