| Abstract |
|---|
| This paper presents a distance metric learning method for k-nearest neighbors regression. The distance metric is learned under triplet constraints, which are built from the neighborhood of each training instance, and the resulting optimization problem can be formulated as a convex quadratic program. However, generic quadratic programming solvers do not scale well to large problems. To reduce the training time, we propose a novel dual coordinate descent method for this type of problem. Experimental results on several regression data sets show that our method achieves competitive performance compared with state-of-the-art distance metric learning methods, while being an order of magnitude faster. |
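The triplet-based constraints described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the paper's exact formulation: the neighborhood definition (ranking other instances by target similarity), the margin, and the hinge loss on Mahalanobis distances are all assumptions made for the sketch.

```python
import numpy as np

def build_triplets(X, y, k=3):
    """For each training instance i, pair each of its k target-nearest
    neighbours j (similar regression targets) with each of its k
    target-farthest instances l, yielding triplets (i, j, l): under the
    learned metric, x_j should be closer to x_i than x_l is."""
    n = len(X)
    triplets = []
    for i in range(n):
        # rank the other instances by how close their targets are to y[i]
        order = np.argsort(np.abs(y - y[i]))
        order = order[order != i]
        similar, dissimilar = order[:k], order[-k:]
        for j in similar:
            for l in dissimilar:
                triplets.append((i, j, l))
    return triplets

def hinge_objective(M, X, triplets, margin=1.0):
    """Sum of hinge losses over the triplets: a triplet (i, j, l) is
    violated when the dissimilar point l is not at least `margin`
    farther from i than the similar point j, measured by the squared
    Mahalanobis distance d_M(a, b) = (a - b)^T M (a - b)."""
    loss = 0.0
    for i, j, l in triplets:
        d_ij = (X[i] - X[j]) @ M @ (X[i] - X[j])
        d_il = (X[i] - X[l]) @ M @ (X[i] - X[l])
        loss += max(0.0, margin + d_ij - d_il)
    return loss
```

Minimizing such a hinge objective over positive semidefinite matrices M, together with a regularizer, is what yields the convex quadratic program mentioned in the abstract; the paper's contribution is solving its dual by coordinate descent rather than with a generic QP solver.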
| Year | DOI | Venue |
|---|---|---|
| 2016 | 10.1016/j.neucom.2016.07.005 | Neurocomputing |

| Keywords | Field | DocType |
|---|---|---|
| Nearest neighbor, Distance metric learning, Regression, Quadratic programming | k-nearest neighbors algorithm, Chebyshev distance, Mathematical optimization, Neighbourhood components analysis, Metric (mathematics), Artificial intelligence, Quadratic programming, Coordinate descent, Time complexity, Optimization problem, Machine learning, Mathematics | Journal |

| Volume | Issue | ISSN |
|---|---|---|
| 214 | C | 0925-2312 |

| Citations | PageRank | References |
|---|---|---|
| 8 | 0.46 | 33 |
| Authors |
|---|
| 3 |

| Name | Order | Citations | PageRank |
|---|---|---|---|
| Bac Nguyen | 1 | 54 | 3.75 |
| Carlos Morell | 2 | 110 | 13.88 |
| Bernard De Baets | 3 | 2994 | 300.39 |