Title
Online AdaBoost-based methods for multiclass problems
Abstract
Boosting is a technique designed to combine a set of weak classifiers into a strong ensemble. To achieve this, the component classifiers are trained on different data samples and their hypotheses are aggregated to produce a better prediction. The use of boosting in online environments is comparatively recent; inspired by its success in offline settings, it is emerging to meet new demands. One of the challenges is to make such methods handle large amounts of information under computational constraints. This paper proposes two new online boosting methods: the first aims to perform a better weight distribution of the instances so as to closely match the behavior of AdaBoost.M1, whereas the second focuses on multiclass problems and is based on AdaBoost.M2. Theoretical arguments are used to demonstrate their convergence and to show that both methods retain the main features of their traditional counterparts. In addition, we performed experiments comparing the accuracy and memory usage of the proposed methods against other approaches on 20 well-known datasets. The results suggest that, in many different situations, the proposed algorithms maintain high accuracy, outperforming the other tested methods.
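For context, the sketch below illustrates a generic online AdaBoost-style update in the spirit of Oza and Russell's online boosting, in which each arriving example is shown to every weak learner a Poisson-distributed number of times and its weight grows or shrinks according to each learner's error. This is illustrative background only, not the algorithms proposed in the paper; the class name OnlineBoostingSketch, the parameter names, and the use of GaussianNB as the weak learner are assumptions made for the example.

# A minimal sketch of an online AdaBoost-style update (Oza-and-Russell-style),
# NOT the methods proposed in this paper; names and choices here are assumed.
import numpy as np
from sklearn.naive_bayes import GaussianNB

class OnlineBoostingSketch:
    def __init__(self, n_models=10, classes=(0, 1), seed=0):
        self.models = [GaussianNB() for _ in range(n_models)]
        self.lambda_sc = np.zeros(n_models)  # weight mass each learner classified correctly
        self.lambda_sw = np.zeros(n_models)  # weight mass each learner classified wrongly
        self.classes = list(classes)         # all class labels must be known up front
        self.rng = np.random.default_rng(seed)

    def partial_fit(self, x, y):
        # Process one labelled example (x, y) from the stream, updating every weak learner.
        lam = 1.0                            # current weight of this example
        x = np.asarray(x, dtype=float).reshape(1, -1)
        for m, clf in enumerate(self.models):
            # Present the example k ~ Poisson(lam) times to the m-th learner.
            for _ in range(self.rng.poisson(lam)):
                clf.partial_fit(x, [y], classes=self.classes)
            if not hasattr(clf, "classes_"):
                continue                     # learner has never seen any data yet
            if clf.predict(x)[0] == y:
                self.lambda_sc[m] += lam
                denom = max(self.lambda_sc[m] + self.lambda_sw[m], 1e-12)
                eps = self.lambda_sw[m] / denom
                lam *= 1.0 / max(2.0 * (1.0 - eps), 1e-12)  # shrink weight on success
            else:
                self.lambda_sw[m] += lam
                denom = max(self.lambda_sc[m] + self.lambda_sw[m], 1e-12)
                eps = self.lambda_sw[m] / denom
                lam *= 1.0 / max(2.0 * eps, 1e-12)          # grow weight on failure

    def predict(self, x):
        # Weighted vote: each learner votes with log((1 - eps) / eps), as in AdaBoost.
        x = np.asarray(x, dtype=float).reshape(1, -1)
        votes = {}
        for m, clf in enumerate(self.models):
            if not hasattr(clf, "classes_"):
                continue
            eps = self.lambda_sw[m] / max(self.lambda_sc[m] + self.lambda_sw[m], 1e-12)
            eps = min(max(eps, 1e-12), 1.0 - 1e-12)
            weight = np.log((1.0 - eps) / eps)
            if weight > 0:
                label = clf.predict(x)[0]
                votes[label] = votes.get(label, 0.0) + weight
        return max(votes, key=votes.get) if votes else None

In such a setup, partial_fit(x, y) would be called once per arriving example; the set of class labels must be fixed in advance, which is one of the constraints multiclass online boosting methods typically work under.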
Year
2020
DOI
10.1007/s10462-019-09696-6
Venue
Artificial Intelligence Review
Keywords
Boosting, Multiclass, Online learning, Data streams
Field
Convergence (routing), Online learning, Data mining, Data stream mining, AdaBoost, Computer science, Artificial intelligence, Boosting (machine learning), Weight distribution, Machine learning
DocType
Journal
Volume
53
Issue
2
ISSN
0269-2821
Citations
1
PageRank
0.34
References
22
Authors
2