Title
Random Separation Learning For Neural Network Ensembles
Abstract
To prevent the individual neural networks from becoming similar during the long learning period of negative correlation learning for designing neural network ensembles, two approaches were adopted in this paper. The first approach is to replace large neural networks with small neural networks in the ensembles; small neural networks are more practical in real applications where computational capacity is limited. The second approach is to introduce random separation learning into negative correlation learning for each small neural network. The idea of random separation learning is to let each individual neural network learn differently on randomly separated subsets of the given training samples. It has been found that the small neural networks can easily become weak and different from each other through negative correlation learning with random separation learning. After applying a large number of small neural networks to the neural network ensembles, two combination methods were used to generate the ensemble output, and their performance was compared.
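The abstract's two key ideas, random separation of the training set across many small learners and two output-combination methods, can be illustrated with a minimal sketch. The paper's actual networks, negative correlation penalty, and datasets are not specified here, so this hypothetical example stands in nearest-centroid classifiers for the "small neural networks" and uses averaging and majority voting as the two combination methods:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data (illustrative only; the paper's datasets are not given here).
X = rng.normal(size=(200, 2)) + np.repeat([[0, 0], [3, 3]], 100, axis=0)
y = np.repeat([0, 1], 100)

M = 10  # number of small learners in the ensemble

# Random separation: shuffle the indices of each class and split them into
# M disjoint subsets, so every learner trains on a different slice of the data.
subsets = [np.concatenate([a, b])
           for a, b in zip(np.array_split(rng.permutation(100), M),
                           np.array_split(100 + rng.permutation(100), M))]

# Each "small network" is stood in for by a nearest-centroid classifier
# fitted only on its own subset (a deliberate simplification).
centroids = []
for idx in subsets:
    c0 = X[idx][y[idx] == 0].mean(axis=0)
    c1 = X[idx][y[idx] == 1].mean(axis=0)
    centroids.append((c0, c1))

def scores(x):
    """Per-learner class-1 score in (0, 1); > 0.5 means the learner says class 1."""
    s = []
    for c0, c1 in centroids:
        d0 = np.linalg.norm(x - c0)
        d1 = np.linalg.norm(x - c1)
        s.append(d0 / (d0 + d1))
    return np.array(s)

def predict_average(x):
    # Combination method 1: average the individual outputs, then threshold.
    return int(scores(x).mean() > 0.5)

def predict_vote(x):
    # Combination method 2: majority vote over the thresholded individual outputs.
    return int((scores(x) > 0.5).sum() > M / 2)
```

Because each learner sees only a small, random slice of the data, the individual predictors stay weak and mutually different, which is the property the paper seeks; the two `predict_*` functions correspond to the two combination methods whose performance the paper compares.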
Year: 2017
Venue: 2017 10TH INTERNATIONAL CONGRESS ON IMAGE AND SIGNAL PROCESSING, BIOMEDICAL ENGINEERING AND INFORMATICS (CISP-BMEI)
Field: Training set, Negative correlation, Pattern recognition, Computer science, Correlation, Artificial intelligence, Artificial neural network
DocType: Conference
Citations: 0
PageRank: 0.34
References: 0
Authors: 1
Name: Yong Liu
Order: 1
Citations/PageRank: 2526265.08