Title
Local negative correlation with resampling
Abstract
This paper presents a learning algorithm that combines two well-known methods for generating ensemble diversity: negative correlation of errors and resampling. In this algorithm, a set of learners iteratively and synchronously improve their state using information about the performance of a fixed number of other learners in the ensemble, generating a form of local negative correlation. Resampling allows the base algorithm to control the impact of highly influential data points, which in turn can improve its generalization error. The resulting algorithm can be viewed as a generalization of bagging in which each learner is no longer independent but can be locally coupled with other learners. We demonstrate the technique on two real data sets using neural network ensembles.
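The algorithmic idea described in the abstract (each learner minimizes its own error plus a negative-correlation penalty computed only against a small neighbourhood of other learners, while training on its own bootstrap sample) can be illustrated with a minimal NumPy sketch. This is not the authors' exact formulation: the linear base learners, ring-shaped neighbourhood, simplified penalty gradient, and all hyperparameters (`n_neighbors`, `lam`, `lr`, `epochs`) are assumptions made here for illustration only.

```python
import numpy as np

def train_local_nc_ensemble(X, y, n_learners=10, n_neighbors=2,
                            lam=0.5, lr=0.01, epochs=200, seed=0):
    """Sketch: linear regressors coupled by a local negative-correlation penalty.

    Each learner i is fit on its own bootstrap sample (resampling), and its
    gradient includes a term pushing its output away from the mean output of
    a fixed number of adjacent learners (a ring-shaped neighbourhood, assumed
    here). Updates are applied synchronously across the ensemble.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Xb = np.hstack([X, np.ones((n, 1))])          # add bias column
    W = rng.normal(scale=0.1, size=(n_learners, d + 1))

    # one bootstrap sample (row indices) per learner
    samples = [rng.integers(0, n, size=n) for _ in range(n_learners)]

    for _ in range(epochs):
        preds = Xb @ W.T                          # (n, n_learners): all learners, all points
        for i in range(n_learners):
            idx = samples[i]
            # local ensemble output: mean over learner i's ring neighbours
            # (the neighbourhood includes learner i itself, as in standard NCL)
            nbrs = [(i + k) % n_learners for k in range(-n_neighbors, n_neighbors + 1)]
            local_mean = preds[idx][:, nbrs].mean(axis=1)
            f_i = preds[idx, i]
            # gradient of (f_i - y)^2 - lam * (f_i - local_mean)^2 w.r.t. W[i],
            # treating local_mean as a constant (the usual NCL simplification)
            err = f_i - y[idx]
            div = f_i - local_mean
            grad = 2 * (err - lam * div)[:, None] * Xb[idx]
            W[i] -= lr * grad.mean(axis=0)
    return W

def predict(W, X):
    """Average the ensemble members' predictions."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return (Xb @ W.T).mean(axis=1)
```

Setting `lam = 0` decouples the learners and the procedure reduces to bagging-style training of independent bootstrap regressors, which matches the abstract's view of the method as a generalization of bagging.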
Year
2006
DOI
10.1007/11875581_69
Venue
IDEAL
Keywords
base algorithm, ensemble diversity, resulting algorithm, influential data point, neural networks ensemble, local negative correlation, generalization error, learners iteratively, error negative correlation
Field
Data set, Data transmission, Computer science, Artificial intelligence, Artificial neural network, Ensemble learning, Resampling, Data point, Pattern recognition, Sort, Algorithm, Correlation, Machine learning
DocType
Conference
Volume
4224
ISSN
0302-9743
ISBN
3-540-45485-3
Citations
2
PageRank
0.38
References
10
Authors
4
Name               Order  Citations  PageRank
Ricardo Ñanculef   1      53         10.64
Carlos Valle       2      21         8.20
Héctor Allende     3      148        31.69
Claudio Moraga     4      612        100.27