Abstract |
---|
In this paper, we propose a dual perspective on online learning algorithms that uses a window method to achieve sparsity and robustness. The approach employs Fenchel conjugates and gradient ascent to carry out the online optimization process. The window method is an update strategy for the classifier consisting of two bounds related to the dual increase: the lower bound discards points that induce only a small dual ascent, while the upper bound caps the dual increase generated by noisy points to reduce their influence on the target boundary. Moreover, with the window method, prediction accuracy can be improved significantly. Detailed experiments on artificial and real-world datasets verify the utility of the proposed approach. |
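The paper itself is not reproduced here, but the abstract's window rule can be illustrated with a minimal sketch. This is not the authors' actual algorithm: it assumes a passive-aggressive-style candidate dual increase (hinge loss over the squared example norm) as a stand-in for the Fenchel-conjugate derivation, and the function name, bound values, and step form are all hypothetical.

```python
import numpy as np

def window_online_update(w, x, y, lower=0.05, upper=1.0):
    """One round of a window-constrained dual-ascent update (illustrative only).

    The candidate dual increase `tau` uses a passive-aggressive-style step
    (an assumption, not the paper's derivation). The window [lower, upper]
    then filters it: increases below `lower` are discarded (sparsity),
    increases above `upper` are clipped (robustness to noisy points).
    """
    loss = max(0.0, 1.0 - y * np.dot(w, x))  # hinge loss on this round
    if loss == 0.0:
        return w, 0.0                        # no margin violation: no update
    tau = loss / np.dot(x, x)                # candidate dual increase
    if tau < lower:
        return w, 0.0                        # lower bound: drop small ascents
    tau = min(tau, upper)                    # upper bound: cap noisy ascents
    return w + tau * y * x, tau              # dual-ascent step on the weights
```

A point whose candidate increase falls below `lower` leaves the classifier untouched (fewer support points, hence sparsity), while an outlier that would otherwise dominate the update is clipped at `upper`.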
Year | DOI | Venue |
---|---|---|
2014 | 10.1145/2632856.2632922 | ICIMCS |
Keywords | Field | DocType
---|---|---|
algorithms,design,online learning,classifier design and evaluation,window method,theory,fenchel conjugates | Online learning,Gradient descent,Computer science,Dual ascent,Upper and lower bounds,Algorithm,Robustness (computer science),Artificial intelligence,Classifier (linguistics) | Conference
Citations | PageRank | References
---|---|---|
0 | 0.34 | 10
Authors |
---|
3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Min Tang | 1 | 623 | 51.33 |
Boliang Sun | 2 | 0 | 0.34 |
Li Guohui | 3 | 447 | 76.53 |