Title
Similarity metric learning for a variable-kernel classifier
Abstract
Nearest-neighbor interpolation algorithms have many useful properties for applications to learning, but they often exhibit poor generalization. In this paper, it is shown that much better generalization can be obtained by using a variable interpolation kernel in combination with conjugate gradient optimization of the similarity metric and kernel size. The resulting method is called variable-kernel similarity metric (VSM) learning. It has been tested on several standard classification data sets, and on these problems it shows better generalization than backpropagation and most other learning methods. The number of parameters that must be determined through optimization is orders of magnitude smaller than for backpropagation or radial basis function (RBF) networks, which may indicate that the method better captures the essential degrees of variation in learning. Other features of VSM learning are discussed that make it relevant to models for biological learning in the brain.
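The abstract describes the VSM approach only at a high level. The Python sketch below illustrates one plausible reading of a variable-kernel similarity-metric classifier: a kernel-weighted nearest-neighbor vote whose distances use learned per-feature weights and whose Gaussian kernel width scales with the local neighborhood radius. The function name vsm_predict, the diagonal form of the metric, and the tying of the kernel width to the k-th neighbor distance are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def vsm_predict(X_train, Y_train, x, metric_weights, k=5, kernel_scale=0.5):
    """Classify x by kernel-weighted voting among its k nearest neighbors.

    Hypothetical sketch of a variable-kernel similarity-metric classifier:
    distances use a learned per-feature weighting (the similarity metric),
    and the Gaussian kernel width follows the distance to the k-th
    neighbor, so the kernel varies with local sample density.
    """
    # Weighted Euclidean distance under the learned (diagonal) metric.
    diffs = X_train - x
    dists = np.sqrt(((diffs * metric_weights) ** 2).sum(axis=1))

    # Take the k nearest neighbors and set the kernel width from the
    # neighborhood radius (this is what makes the kernel "variable").
    idx = np.argsort(dists)[:k]
    sigma = kernel_scale * dists[idx[-1]] + 1e-12
    weights = np.exp(-0.5 * (dists[idx] / sigma) ** 2)

    # Kernel-weighted vote over class labels (Y_train holds one-hot rows).
    scores = weights @ Y_train[idx]
    return scores / scores.sum()
```

In a full implementation, metric_weights and kernel_scale would not be fixed by hand but optimized by conjugate gradient on a cross-validation error, as the abstract indicates.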
Year
1995
DOI
10.1162/neco.1995.7.1.72
Venue
Neural Computation
Keywords
nearest-neighbor interpolation algorithm,variable-kernel classifier,variable-kernel similarity metric,vsm learning,similarity metric learning,conjugate gradient optimization,kernel size,generalization,backpropagation,biological learning,learning method
DocType
Journal
Volume
7
Issue
1
ISSN
0899-7667
Citations
110
PageRank
24.49
References
13
Authors
1
Name
D. G. Lowe
Order
1
Citations
15718
PageRank
1413.60