Title
Build Correlation Awareness In Negative Correlation Learning
Abstract
This paper proposes implementing negative correlation learning (NCL) by optimizing the two different learning functions on two separate, non-overlapping subsets. Because the two subsets can be randomly generated for each individual neural network (NN), they differ for every pair of individual NNs in a neural network ensemble (NNE). When the two learning functions in NCL are optimized separately, each individual NN can avoid conflicts in learning by always having a unique learning direction on a given data sample. Therefore, each individual NN is clearly aware of its own learning direction on every training data sample. Such self-awareness is essential for creating a set of cooperative NNs for an NNE. Experimental results show that the individual NNs trained by NCL with such separate learning can maintain their differences and exhibit stable performance even over a longer training process.
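The following is a rough, illustrative sketch of the idea described in the abstract: each ensemble member optimizes the usual NCL error term on one random subset and the negative-correlation penalty on a disjoint subset, so every training sample gives that member a single gradient direction. The toy data, the 50/50 split, and the lambda, learning-rate, and epoch values are assumptions made here for illustration, not details taken from the paper.

```python
# Sketch: NCL where each network's error term and correlation term are
# optimized on two disjoint random subsets (assumed details, see lead-in).
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (assumed; not from the paper).
N, D, M = 200, 5, 4                      # samples, input dim, ensemble size
X = rng.normal(size=(N, D))
true_w = rng.normal(size=D)
y = X @ true_w + 0.1 * rng.normal(size=N)

W = rng.normal(scale=0.1, size=(M, D))   # one linear "network" per row
lam, lr, epochs = 0.5, 0.01, 100         # assumed hyperparameter values

# For each network i, draw two disjoint random subsets:
# A_i for the error term, B_i for the correlation term.
subsets = []
for i in range(M):
    perm = rng.permutation(N)
    subsets.append((perm[: N // 2], perm[N // 2:]))

for _ in range(epochs):
    F = X @ W.T                  # individual outputs, shape (N, M)
    F_bar = F.mean(axis=1)       # ensemble output, shape (N,)
    for i in range(M):
        A, B = subsets[i]
        # Error-term gradient, only on subset A_i: (f_i - y) * x
        err = F[A, i] - y[A]
        g_err = X[A].T @ err / len(A)
        # Correlation-term gradient, only on subset B_i:
        # standard NCL penalty p_i = -(f_i - F_bar)^2, so d p_i / d f_i
        # contributes -lam * (f_i - F_bar) * x to the gradient.
        corr = F[B, i] - F_bar[B]
        g_corr = -lam * (X[B].T @ corr) / len(B)
        W[i] -= lr * (g_err + g_corr)

print("ensemble MSE:", np.mean(((X @ W.T).mean(axis=1) - y) ** 2))
```

Because the two gradient components never act on the same sample for a given network, each sample yields exactly one learning direction for that network, which is the "unique learning direction" property the abstract emphasizes.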
Year
2017
Venue
2017 13TH INTERNATIONAL CONFERENCE ON NATURAL COMPUTATION, FUZZY SYSTEMS AND KNOWLEDGE DISCOVERY (ICNC-FSKD)
Field
Training set, Negative correlation, Sample (statistics), Computer science, Correlation, Artificial intelligence, Artificial neural network, Machine learning
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
1
Name | Order | Citations | PageRank
Yong Liu12526265.08