Title
An Analysis of Boosted Linear Classifiers on Noisy Data with Applications to Multiple-Instance Learning
Abstract
An interesting observation about the well-known AdaBoost algorithm is that, though theory suggests it should overfit when applied to noisy data, experiments indicate it often does not do so in practice. In this paper, we study the behavior of AdaBoost on datasets with one-sided uniform class noise using linear classifiers as the base learner. We show analytically that, under some ideal conditions, this approach will not overfit, and can in fact recover a zero-error concept with respect to the true, uncorrupted instance labels. We also analytically show that AdaBoost increases the margins of predictions over boosting iterations, as has been previously suggested in the literature. We then study the empirical behavior of AdaBoost on real-world datasets with one-sided noise derived from multiple-instance data. Although our assumptions may not hold in practical settings, our experiments show that standard AdaBoost still performs well, as suggested by our analysis, and often outperforms baseline variations in the literature that explicitly try to account for noise.
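Below is a minimal sketch, not the paper's code, of the setup the abstract describes: standard AdaBoost with linear base classifiers trained on data with one-sided class noise, then evaluated against the true (uncorrupted) labels. The synthetic dataset, the 20% noise rate, the number of boosting rounds, and the logistic-regression base learner are illustrative assumptions, not the authors' experimental protocol.

```python
# Hedged sketch: AdaBoost with linear base learners on one-sided class noise.
# Everything below (dataset, noise rate, base learner choice) is assumed for
# illustration only; it is not the paper's exact setup.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

rng = np.random.RandomState(0)
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
y = 2 * y - 1  # labels in {-1, +1}

# One-sided uniform class noise: flip a fraction of one class's labels only.
noise_rate = 0.2
pos = np.where(y == 1)[0]
flip = rng.choice(pos, size=int(noise_rate * len(pos)), replace=False)
y_noisy = y.copy()
y_noisy[flip] = -1

n = len(y_noisy)
w = np.full(n, 1.0 / n)   # instance weights
learners, alphas = [], []

for t in range(50):  # boosting iterations
    clf = LogisticRegression(max_iter=1000)
    clf.fit(X, y_noisy, sample_weight=w)   # linear base classifier
    pred = clf.predict(X)
    err = np.sum(w * (pred != y_noisy))    # weighted training error
    if err == 0 or err >= 0.5:
        break
    alpha = 0.5 * np.log((1 - err) / err)  # base learner weight
    w *= np.exp(-alpha * y_noisy * pred)   # exponential reweighting
    w /= w.sum()
    learners.append(clf)
    alphas.append(alpha)

# Ensemble prediction: sign of the weighted vote (the margin).
margin = sum(a * c.predict(X) for a, c in zip(alphas, learners))
print("training error vs. true labels:", np.mean(np.sign(margin) != y))
```

The final line compares the boosted ensemble against the true labels rather than the noisy ones, mirroring the abstract's claim that, under ideal conditions, boosting linear classifiers can recover a zero-error concept with respect to the uncorrupted labels.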
Year
2017
DOI
10.1109/ICDM.2017.38
Venue
2017 IEEE International Conference on Data Mining (ICDM)
Keywords
AdaBoost, Noise, Multiple-Instance Learning
Field
Adaboost algorithm, Noisy data, AdaBoost, Noise measurement, Computer science, Artificial intelligence, Boosting (machine learning), Overfitting, Machine learning
DocType
Conference
ISSN
1550-4786
ISBN
978-1-5386-2449-4
Citations
0
PageRank
0.34
References
21
Authors
2
Name, Order, Citations, PageRank
Rui Liu, 1, 0, 1.01
Soumya Ray, 2, 94, 8.89