Abstract |
---|
Neural network ensembling is a learning paradigm in which several neural networks are jointly used to solve a problem. This paper analyzes the relationship between the generalization ability of a neural network ensemble and the correlation of its individual networks, revealing that in some cases ensembling a selective subset of the individual networks is superior to ensembling all of them. Based on this analysis, an approach named GASEN is proposed: it trains several individual neural networks and then employs a genetic algorithm to select an optimal subset of them to constitute the ensemble. Experimental results show that, compared with a popular ensemble approach (averaging all individual networks) and a theoretically optimal selective approach (enumerating all subsets), GASEN generates ensembles with strong generalization ability at relatively small computational cost. |
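The abstract outlines the GASEN procedure: train a pool of individual networks, then use a genetic algorithm to pick the subset whose combined (averaged) output generalizes best. A minimal, self-contained sketch of that idea follows. This is not the authors' code: GASEN itself evolves real-valued weights per network, whereas this simplified variant uses a binary include/exclude chromosome, and the "individual networks" are simulated here as noisy predictors; all data and parameters are illustrative.

```python
import random

random.seed(0)

# Validation data for a simple target function y = 2x + 1.
xs = [i / 10.0 for i in range(50)]
target = [2.0 * x + 1.0 for x in xs]

# Simulate N trained individual networks as predictors with
# differing systematic bias and noise levels.
N = 8
predictors = []
for _ in range(N):
    bias = random.uniform(-1.0, 1.0)
    noise = random.uniform(0.1, 1.0)
    predictors.append([2.0 * x + 1.0 + bias + random.gauss(0.0, noise) for x in xs])

def ensemble_mse(mask):
    """Validation MSE of the average of the predictors selected by a 0/1 mask."""
    chosen = [p for p, m in zip(predictors, mask) if m]
    if not chosen:
        return float("inf")  # empty ensembles are invalid
    avg = [sum(col) / len(chosen) for col in zip(*chosen)]
    return sum((a - t) ** 2 for a, t in zip(avg, target)) / len(xs)

def evolve(pop_size=20, generations=40, p_mut=0.1):
    """Evolve binary subset masks to minimize the ensemble's validation MSE."""
    pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=ensemble_mse)
        survivors = pop[: pop_size // 2]          # elitist truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)    # two parents
            cut = random.randrange(1, N)          # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if random.random() < p_mut else g for g in child]
            children.append(child)
        pop = survivors + children
    return min(pop, key=ensemble_mse)

best = evolve()
print("selected subset:", best)
print("selective ensemble MSE:", round(ensemble_mse(best), 4))
print("full ensemble MSE:", round(ensemble_mse([1] * N), 4))
```

Because the fittest masks survive unchanged each generation, the final subset's error is at most that of any mask the search visits; in this toy setting the selected subset typically matches or beats averaging all predictors, which is the phenomenon the paper exploits.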
Year | Venue | Keywords |
---|---|---|
2001 | IJCAI | optimum subset, generalization ability, individual network, neural network, genetic algorithm, selective subset, strong generalization ability, popular ensemble approach, theoretically optimum selective ensemble, neural network ensemble, individual neural network, selective neural network ensemble
Field | DocType | ISBN
---|---|---|
Computer science, Time delay neural network, Correlation, Artificial intelligence, Artificial neural network, Ensemble learning, Genetic algorithm, Machine learning | Conference | 1-55860-812-5
Citations | PageRank | References
---|---|---|
44 | 1.74 | 11
Authors |
---|
4 |
Name | Order | Citations | PageRank |
---|---|---|---|
Zhi-Hua Zhou | 1 | 13480 | 569.92 |
Jianxin Wu | 2 | 3276 | 154.17 |
Yuan Jiang | 3 | 714 | 53.61 |
Shifu Chen | 4 | 430 | 38.48 |