Abstract |
---|
By introducing an adaptive error function, balanced ensemble learning has been developed from negative correlation learning. In this paper, balanced ensemble learning is used to train a set of small neural networks, each with only one hidden node. The experimental results suggest that balanced ensemble learning can create a strong ensemble by combining a set of weak learners. Unlike bagging and boosting, where learners are trained on data randomly re-sampled from the original set of patterns, learners in balanced ensemble learning can be trained on all available data. Interestingly, the learners produced by balanced ensemble learning may be only slightly better than random guessing even though they have been trained on the whole data set. Another difference among these ensemble learning methods is that learners are trained simultaneously in balanced ensemble learning, whereas they are trained independently in bagging and sequentially in boosting. |
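The abstract does not specify the paper's adaptive (balanced) error function, but the base method it builds on, negative correlation learning with simultaneously trained one-hidden-node learners, can be sketched. The following is a minimal illustration, not the paper's implementation: the toy data set, ensemble size, penalty strength `lam`, and learning rate are all assumptions made for the example. Each learner `i` minimizes `(F_i - y)^2 / 2 + lam * (F_i - Fbar) * sum_{j!=i}(F_j - Fbar)`, giving the per-output gradient `(F_i - y) - lam * (F_i - Fbar)`, and all learners update in the same pass.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-D classification data (illustrative assumption; the paper's
# benchmark data sets are not reproduced here).
X = rng.normal(size=(200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float)  # XOR-like targets in {0, 1}

M = 8      # ensemble size (assumed)
lam = 0.5  # negative-correlation penalty strength lambda (assumed)
lr = 0.05  # learning rate (assumed)

# Each learner is a network with ONE hidden tanh node, as in the paper:
# output_i = v_i * tanh(x @ w_i + b_i) + c_i
w = rng.normal(scale=0.5, size=(M, 2))
b = np.zeros(M)
v = rng.normal(scale=0.5, size=M)
c = np.zeros(M)

def forward():
    h = np.tanh(X @ w.T + b)      # (N, M) hidden activations
    return h, h * v + c           # (N, M) individual learner outputs

_, F0 = forward()
mse_start = np.mean((F0.mean(axis=1) - y) ** 2)

for epoch in range(500):
    h, F = forward()
    Fbar = F.mean(axis=1, keepdims=True)          # ensemble output
    # NCL gradient w.r.t. each learner's output:
    # (F_i - y) - lambda * (F_i - Fbar); all learners update together.
    dF = (F - y[:, None]) - lam * (F - Fbar)
    dv = (dF * h).mean(axis=0)
    dc = dF.mean(axis=0)
    dh = dF * v * (1 - h ** 2)                    # back through tanh
    dw = dh.T @ X / len(X)
    db = dh.mean(axis=0)
    v -= lr * dv; c -= lr * dc; w -= lr * dw; b -= lr * db

_, F = forward()
mse_end = np.mean((F.mean(axis=1) - y) ** 2)
```

Although each one-node learner can realize only a near-linear decision on its own, the simultaneous training with the correlation penalty pushes the learners apart, so their average (the ensemble) can fit structure no single weak learner can, which matches the abstract's claim about combining weak learners.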
Year | DOI | Venue
---|---|---
2009 | 10.1007/978-3-642-04843-2_18 | ISICA

Keywords | Field | DocType
---|---|---
strong ensemble, adaptive error function, available data, small neural networks, whole data, original set, balanced ensemble, balanced learning, re-sampled data, negative correlation learning, balanced ensemble learning, neural network, ensemble learning | Error function, Negative correlation, Semi-supervised learning, Computer science, Artificial intelligence, Boosting (machine learning), Artificial neural network, Ensemble learning, Hidden node problem, Machine learning | Conference

Volume | ISSN | Citations
---|---|---
5821 | 0302-9743 | 5

PageRank | References | Authors
---|---|---
0.67 | 9 | 1