Title
Randomization vs. Optimization in SVM Ensembles
Abstract
Ensembles of SVMs are notoriously difficult to build because of the stability of the model produced by a single SVM. Applying standard bagging or boosting algorithms generally yields small accuracy improvements at a computational cost that increases with the size of the ensemble. In this work, we leverage subsampling and the diversification of hyperparameters, through both optimization and randomization, to build SVM ensembles at a much lower computational cost than training a single SVM on the same data. Furthermore, the accuracy of these ensembles is comparable to that of a single SVM and of a fully optimized SVM ensemble.
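The approach the abstract describes, training each SVM on a small subsample of the data with randomized hyperparameters and aggregating by majority vote, can be sketched as follows. This is a minimal illustration using scikit-learn; the subsample ratio, hyperparameter ranges, and synthetic data are assumptions for demonstration, not the paper's experimental settings:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=600, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

n_estimators = 25
subsample = 0.2  # small subsamples keep each SVM cheap to train

models = []
for _ in range(n_estimators):
    # draw a random subsample without replacement
    idx = rng.choice(len(X_tr), size=int(subsample * len(X_tr)), replace=False)
    # randomize hyperparameters (log-uniform ranges are an assumption)
    # instead of optimizing them for each ensemble member
    C = 10.0 ** rng.uniform(-1, 3)
    gamma = 10.0 ** rng.uniform(-3, 1)
    models.append(SVC(C=C, gamma=gamma).fit(X_tr[idx], y_tr[idx]))

# aggregate the individual predictions by majority vote
votes = np.stack([m.predict(X_te) for m in models])
pred = (votes.mean(axis=0) > 0.5).astype(int)
accuracy = (pred == y_te).mean()
print("ensemble test accuracy:", accuracy)
```

Because each base SVM sees only a fraction of the data, training the whole ensemble can be cheaper than fitting one SVM on the full set, while the randomized hyperparameters provide the diversity that voting needs.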
Year
2018
DOI
10.1007/978-3-030-01421-6_40
Venue
ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2018, PT II
Keywords
Ensemble learning, Support vector machines, Randomization
Field
Pattern recognition, Hyperparameter, Computer science, Support vector machine, Boosting (machine learning), Artificial intelligence, Ensemble learning, Machine learning
DocType
Conference
Volume
11140
ISSN
0302-9743
Citations
0
PageRank
0.34
References
10
Authors
3
Name                      Order  Citations  PageRank
Maryam Sabzevari          1      10         2.57
Gonzalo Martínez-Muñoz    2      524        23.76
Alberto Suárez            3      487        22.33