Title
Improving generalization capability of neural networks based on simulated annealing
Abstract
This paper presents single-objective and multiobjective stochastic optimization algorithms, based on simulated annealing, for global training of neural networks. The algorithms overcome the limitation of conventional gradient-based training methods, which find only local optima, by performing global optimization of the network weights. In particular, the multiobjective training algorithm is designed to enhance the generalization capability of the trained networks by simultaneously minimizing the training error and the dynamic range of the network weights. For fast convergence and good solution quality, we propose a hybrid algorithm that combines simulated annealing with gradient-based local optimization. Experimental results show that networks trained by the proposed methods outperform those trained by the gradient-based local algorithm and, moreover, that their generalization capability is significantly improved because overfitting is prevented.
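The hybrid scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the multiobjective formulation (training error vs. weight dynamic range) is scalarized here into a single weighted cost with an assumed trade-off weight `lam`, plain backprop on a tiny one-hidden-layer network stands in for the gradient-based local optimizer, and all hyperparameters (temperature schedule, perturbation scale) are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task: fit y = sin(x) with a 1-8-1 tanh network.
X = np.linspace(-3, 3, 40).reshape(-1, 1)
y = np.sin(X)

def init_weights():
    return [rng.normal(0, 0.5, (1, 8)), rng.normal(0, 0.5, 8),
            rng.normal(0, 0.5, (8, 1)), rng.normal(0, 0.5, 1)]

def forward(w, X):
    W1, b1, W2, b2 = w
    return np.tanh(X @ W1 + b1) @ W2 + b2

def cost(w, lam=0.01):
    # Scalarized objective: training MSE plus a penalty on the
    # weight dynamic range (largest weight magnitude). lam is an
    # assumed trade-off weight, not a value from the paper.
    err = np.mean((forward(w, X) - y) ** 2)
    dyn = max(np.abs(p).max() for p in w)
    return err + lam * dyn

def perturb(w, scale):
    # SA move: add Gaussian noise to every weight.
    return [p + rng.normal(0, scale, p.shape) for p in w]

def grad_refine(w, steps=5, lr=0.05):
    # Gradient-based local refinement (backprop on the MSE term),
    # playing the role of the paper's local optimizer.
    W1, b1, W2, b2 = w
    for _ in range(steps):
        h = np.tanh(X @ W1 + b1)
        out = h @ W2 + b2
        d_out = 2 * (out - y) / len(X)
        dW2, db2 = h.T @ d_out, d_out.sum(0)
        dh = (d_out @ W2.T) * (1 - h ** 2)
        dW1, db1 = X.T @ dh, dh.sum(0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    return [W1, b1, W2, b2]

def hybrid_sa(iters=500, T0=1.0, alpha=0.998):
    w = init_weights()
    c = cost(w)
    best, best_c = [p.copy() for p in w], c
    T = T0
    for _ in range(iters):
        cand = grad_refine(perturb(w, scale=0.1 * T))  # hybrid step
        cc = cost(cand)
        # Metropolis acceptance: always take improvements, sometimes
        # accept worse candidates to escape local optima.
        if cc < c or rng.random() < np.exp((c - cc) / T):
            w, c = cand, cc
            if c < best_c:
                best, best_c = [p.copy() for p in w], c
        T *= alpha  # geometric cooling schedule
    return best, best_c

w_best, c_best = hybrid_sa()
print(f"final cost: {c_best:.4f}")
```

Tracking the best-so-far solution alongside the Metropolis-accepted one means the returned network never degrades even when the chain accepts uphill moves at high temperature.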
Year
2007
DOI
10.1109/CEC.2007.4424918
Venue
IEEE Congress on Evolutionary Computation
Keywords
global optimization, neural network, stochastic processes, stochastic optimization, simulated annealing, neural nets, convergence, dynamic range
Field
Convergence (routing), Simulated annealing, Stochastic optimization, Mathematical optimization, Global optimization, Computer science, Stochastic neural network, Artificial intelligence, Overfitting, Local search (optimization), Artificial neural network, Machine learning
DocType
Conference
Citations
4
PageRank
0.42
References
8
Authors
4
Name | Order | Citations | PageRank
Yeejin Lee | 1 | 4 | 0.42
Jong-Seok Lee | 2 | 827 | 61.06
Sun-Young Lee | 3 | 96 | 12.49
Cheol Hoon | 4 | 178 | 30.78