Abstract |
---|
We present an online Support Vector Machine (SVM) that uses Stochastic Meta-Descent (SMD) to adapt its step size automatically. We formulate the online learning problem as a stochastic gradient descent in Reproducing Kernel Hilbert Space (RKHS) and translate SMD to the nonparametric setting, where its gradient trace parameter is no longer a coefficient vector but an element of the RKHS. We derive efficient updates that allow us to perform the step size adaptation in linear time. We apply the online SVM framework to a variety of loss functions and in particular show how to achieve efficient online multiclass classification. Experimental evidence suggests that our algorithm outperforms existing methods. |
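The abstract describes stochastic gradient descent on an SVM objective with step sizes adapted online by Stochastic Meta-Descent (SMD). Below is a minimal illustrative sketch of that idea for a linear kernel (the paper itself works in an RKHS, where the gradient trace is a function rather than a vector); the function name, hyperparameter values, and the Hessian-vector approximation are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def online_svm_smd(X, y, eta0=0.1, mu=0.1, lam=0.99, reg=1e-3):
    """Sketch: online linear SVM trained by SGD with SMD step-size
    adaptation. Per-coordinate gains eta are adapted using the
    gradient trace v (roughly dw/d(log eta))."""
    n, d = X.shape
    w = np.zeros(d)
    eta = np.full(d, eta0)   # per-coordinate step sizes
    v = np.zeros(d)          # SMD gradient trace
    for x, t in zip(X, y):
        margin = t * (w @ x)
        # gradient of  reg/2 * ||w||^2 + max(0, 1 - t * w.x)
        g = reg * w - (t * x if margin < 1 else 0.0)
        # SMD gain adaptation: eta grows where successive gradients
        # correlate (g . v < 0), shrinks where they oscillate
        eta *= np.maximum(0.5, 1.0 - mu * g * v)
        # hinge loss is flat almost everywhere, so we approximate the
        # Hessian-vector product by the regularizer's contribution only
        Hv = reg * v
        v = lam * v - eta * (g + lam * Hv)
        w -= eta * g
    return w
```

On linearly separable data a single pass typically suffices to recover a good separating direction, with the gains adjusting automatically instead of requiring a hand-tuned learning-rate schedule.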
Year | DOI | Venue |
---|---|---|
2005 | 10.1109/ISSPA.2005.1581065 | ISSPA 2005: THE 8TH INTERNATIONAL SYMPOSIUM ON SIGNAL PROCESSING AND ITS APPLICATIONS, VOLS 1 AND 2, PROCEEDINGS |
Keywords | Field | DocType
---|---|---
statistics,algorithms,machine learning,metadata,support vector,kernel,hilbert space,vector quantization,support vector machines,stochastic processes | Kernel (linear algebra),Online machine learning,Stochastic gradient descent,Pattern recognition,Computer science,Support vector machine,Vector quantization,Artificial intelligence,Time complexity,Machine learning,Reproducing kernel Hilbert space,Multiclass classification | Conference
Citations | PageRank | References
---|---|---
0 | 0.34 | 3
Authors |
---|
4 |
Name | Order | Citations | PageRank |
---|---|---|---|
Alexandros Karatzoglou | 1 | 1522 | 68.76 |
S. V. N. Vishwanathan | 2 | 1991 | 131.90 |
Nicol N. Schraudolph | 3 | 1185 | 164.26 |
Alexander J. Smola | 4 | 19627 | 1967.09 |