Abstract |
---|
Boosting is one of the most significant developments in machine learning in recent years. Although boosting has achieved great success in practical applications, its internal mechanism is not yet entirely understood. In this paper, we present a new perspective for designing boosting algorithms: extracting independent weak rules. A boosting algorithm can be divided into two parts: an extractor and a combiner. We first introduce the concept of independence into boosting. Our goal is to use an extractor to generate a sequence of high-accuracy weak rules that are mutually independent under the original data distribution, and then to use a combiner to merge these independent rules into a strong classifier. To design such a boosting algorithm, we introduce an assumption based on the essence of weak learners. From this perspective, the mechanism of AdaBoost can be interpreted very naturally, and we propose a criterion for evaluating whether a weak learner is suitable for boosting. A series of experiments on real datasets verifies the theoretical conclusions derived in this paper. Copyright © 2009, the authors. All rights reserved. |
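The abstract's extractor/combiner split maps directly onto classic AdaBoost: the extractor reweights the data each round to pull out one weak rule, and the combiner is the final weighted vote. Below is a minimal, hedged sketch of that view using decision stumps; the function names and the stump search are illustrative choices, not the paper's own algorithm.

```python
import numpy as np

def train_adaboost(X, y, n_rounds=20):
    """Toy AdaBoost with decision stumps. The loop body plays the role of
    the 'extractor' (reweight, pull out one weak rule); the returned rule
    list plus `predict` below plays the 'combiner'. Labels must be +/-1."""
    n = len(y)
    w = np.full(n, 1.0 / n)  # current distribution over examples
    rules = []               # (feature, threshold, polarity, alpha)
    for _ in range(n_rounds):
        # extractor step: best decision stump under the current weights
        best = None
        for j in range(X.shape[1]):
            for thr in np.unique(X[:, j]):
                for pol in (1, -1):
                    pred = np.where(X[:, j] <= thr, pol, -pol)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, pol, pred)
        err, j, thr, pol, pred = best
        if err >= 0.5:
            break            # not even a weak learner any more
        err = max(err, 1e-10)  # avoid log(1/0) on a perfect stump
        alpha = 0.5 * np.log((1 - err) / err)
        rules.append((j, thr, pol, alpha))
        # reweighting: misclassified examples gain weight next round
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
    return rules

def predict(rules, X):
    """Combiner step: sign of the alpha-weighted vote of the weak rules."""
    score = np.zeros(len(X))
    for j, thr, pol, alpha in rules:
        score += alpha * np.where(X[:, j] <= thr, pol, -pol)
    return np.sign(score)
```

On a tiny separable set such as `X = [[0],[1],[2],[3]]`, `y = [1,1,-1,-1]`, the extractor finds a single stump in round one and the combined vote classifies all points correctly.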
Year | DOI | Venue |
---|---|---|
2010 | null | ISAIM |

Keywords | Field | DocType |
---|---|---|
adaboost, boosting, independent rules, machine learning | AdaBoost, Pattern recognition, Boosting (machine learning), Extractor, Artificial intelligence, Classifier (linguistics), Merge (version control), Independence (probability theory), Machine learning, BrownBoost, Mathematics, Gradient boosting | Conference |

Volume | Issue | Citations |
---|---|---|
null | null | 0 |

PageRank | References | Authors |
---|---|---|
0.34 | 4 | 2 |
Name | Order | Citations | PageRank |
---|---|---|---|
Yuchen Zhang | 1 | 660 | 36.47 |
Li Zhang | 2 | 41 | 10.80 |