Abstract |
---|
Balanced ensemble learning was developed from negative correlation learning by shifting the learning targets. Based on the different learning behaviors of balanced ensemble learning for two structures of neural network ensembles, on both low-noise and high-noise data, a number of new discoveries are revealed in this paper. The first discovery is that ensembles of small neural networks trained by balanced ensemble learning can perform as well as ensembles of large neural networks trained by negative correlation learning. The second discovery is that overfitting seldom occurs in balanced ensemble learning for ensembles of small neural networks. In contrast, overfitting was observed in balanced ensemble learning for ensembles of large neural networks on both low-noise and high-noise data. The third discovery is that both large and small mean squared errors can lead to overfitting. That a larger mean squared error leads to overfitting rather than underfitting might come as a surprise. Explanations of this rare phenomenon are presented in this paper. |
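The abstract's starting point, negative correlation learning, trains each ensemble member on its own error plus a penalty that decorrelates it from the ensemble mean. Below is a minimal sketch of that idea for a toy regression task; it is an illustration, not the paper's exact balanced-ensemble algorithm, and all names (`lambda_`, network sizes, learning rate) are assumptions for the example.

```python
# Minimal sketch of negative correlation learning (NCL) for a regression
# ensemble. Illustrative only -- not the paper's balanced ensemble learning;
# lambda_ is the (assumed) strength of the correlation penalty.
import numpy as np

rng = np.random.default_rng(0)

# Toy noisy 1-D regression data
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=200)

M, H = 4, 5            # ensemble size, hidden units per (small) network
lr, lambda_ = 0.05, 0.5

# One-hidden-layer tanh networks, randomly initialised
W1 = [rng.normal(0, 0.5, (1, H)) for _ in range(M)]
b1 = [np.zeros(H) for _ in range(M)]
W2 = [rng.normal(0, 0.5, H) for _ in range(M)]
b2 = [0.0 for _ in range(M)]

def forward(i):
    h = np.tanh(X @ W1[i] + b1[i])       # hidden activations, shape (N, H)
    return h, h @ W2[i] + b2[i]          # network outputs, shape (N,)

def ensemble_mse():
    fbar = np.mean([forward(i)[1] for i in range(M)], axis=0)
    return np.mean((fbar - y) ** 2)

mse0 = ensemble_mse()                    # error before training

for epoch in range(300):
    hs, fs = zip(*(forward(i) for i in range(M)))
    fbar = np.mean(fs, axis=0)           # ensemble (average) output
    for i in range(M):
        # NCL gradient w.r.t. the i-th network's output:
        # (f_i - y) + lambda * sum_{j != i}(f_j - fbar)
        #   = (f_i - y) - lambda * (f_i - fbar)
        delta = (fs[i] - y) - lambda_ * (fs[i] - fbar)
        dW2 = hs[i].T @ delta / len(y)
        db2 = delta.mean()
        dh = np.outer(delta, W2[i]) * (1 - hs[i] ** 2)  # backprop through tanh
        dW1 = X.T @ dh / len(y)
        db1 = dh.mean(axis=0)
        W2[i] -= lr * dW2; b2[i] -= lr * db2
        W1[i] -= lr * dW1; b1[i] -= lr * db1

final_mse = ensemble_mse()
print(f"ensemble MSE: {mse0:.4f} -> {final_mse:.4f}")
```

With `lambda_ = 0` this reduces to independent training of each network; raising it pushes the members' errors to be negatively correlated, which is the mechanism the paper's balanced ensemble learning builds on by shifting the learning targets.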
Year | DOI | Venue |
---|---|---|
2012 | 10.1109/IJCNN.2012.6252423 | IJCNN |
Keywords | Field | DocType |
mean squared errors,learning behaviors,negative correlation learning,learning (artificial intelligence),data analysis,low noisy data,balanced ensemble learning,learning targets,high noisy data,neural nets,neural network ensembles,mean square error methods,diabetes,correlation,noise measurement,neural networks | Competitive learning,Negative correlation,Noisy data,Pattern recognition,Noise measurement,Computer science,Mean squared error,Artificial intelligence,Overfitting,Artificial neural network,Ensemble learning,Machine learning | Conference
ISSN | ISBN | Citations |
2161-4393 | 978-1-4673-1489-3 | 0
PageRank | References | Authors |
0.34 | 0 | 1 |