Title
Feedforward Neural Network Initialization: an Evolutionary Approach
Abstract
The initial set of weights used in supervised learning for multilayer neural networks has a strong influence on the learning speed and on the quality of the solution obtained after convergence. An inadequate choice of the initial weight values may cause the training process to get stuck in a poor local minimum or to run into numerical problems. Several techniques have been proposed that try to avoid both local minima and numerical instability solely through a proper definition of the initial set of weights. The focus of this paper is on the application of genetic algorithms (GAs) as a tool to analyze the space of weights, in order to achieve good initial conditions for supervised learning. The GA's almost-global sampling complements connectionist local search techniques well, and allows us to identify some very important characteristics of the initial set of weights for multilayer networks. The results presented are compared, for a set of benchmarks, with those produced by other approaches found in the literature.
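The paper gives no implementation details beyond this abstract, but the general idea it describes (a GA samples the weight space, and the fittest individual then seeds ordinary supervised training) can be illustrated with a minimal sketch. The Python/NumPy code below is an assumption-laden illustration, not the authors' method: the 2-4-1 network, the XOR task, and all GA settings (population of 50, tournament selection, uniform crossover, Gaussian mutation with scale 0.1) are hypothetical choices made only for this example.

    # Minimal sketch: evolve initial weights for a small feedforward network,
    # then hand the best individual to ordinary gradient-based training.
    # All parameter choices here are illustrative assumptions, not the paper's.
    import numpy as np

    rng = np.random.default_rng(0)

    # Tiny 2-4-1 network on the XOR benchmark (a common test in this literature).
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)
    sizes = [(2, 4), (4, 1)]                     # weight-matrix shapes per layer
    n_params = sum(a * b + b for a, b in sizes)  # weights plus biases, flattened

    def unpack(theta):
        """Split a flat parameter vector into per-layer (W, b) pairs."""
        layers, i = [], 0
        for a, b in sizes:
            W = theta[i:i + a * b].reshape(a, b); i += a * b
            bias = theta[i:i + b]; i += b
            layers.append((W, bias))
        return layers

    def forward(theta, X):
        h = X
        for W, b in unpack(theta):
            h = 1.0 / (1.0 + np.exp(-(h @ W + b)))   # sigmoid activations
        return h

    def fitness(theta):
        # Lower mean-squared error on the training set -> fitter individual.
        return -np.mean((forward(theta, X) - y) ** 2)

    # Plain generational GA: tournament selection, uniform crossover, mutation.
    pop = rng.normal(0.0, 1.0, size=(50, n_params))
    for _ in range(100):
        scores = np.array([fitness(ind) for ind in pop])
        new_pop = []
        for _ in range(len(pop)):
            i, j = rng.integers(0, len(pop), size=2)
            a = pop[i] if scores[i] > scores[j] else pop[j]   # tournament parent 1
            i, j = rng.integers(0, len(pop), size=2)
            b = pop[i] if scores[i] > scores[j] else pop[j]   # tournament parent 2
            mask = rng.random(n_params) < 0.5                 # uniform crossover
            child = np.where(mask, a, b) + rng.normal(0, 0.1, n_params)  # mutation
            new_pop.append(child)
        pop = np.array(new_pop)

    best = pop[np.argmax([fitness(ind) for ind in pop])]
    # `best` would then serve as the initial weight vector for standard
    # backpropagation (the "connectionist local search" step described above).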
Year
1998
DOI
10.1109/SBRN.1998.730992
Venue
SBRN
Keywords
poor local minimum, inadequate initial choice, feedforward neural network initialization, good initial condition, multilayer neural network, initial set, local search technique, supervised learning, evolutionary approach, local minimum, multilayer network, abnormal numerical problem, feedforward neural network, local search, genetic algorithms, learning artificial intelligence, convergence, feedforward neural networks, initial condition, genetic algorithm, independent component analysis, neural networks, local minima, initialization, network topology
Field
Feedforward neural network, Pattern recognition, Computer science, Stochastic neural network, Recurrent neural network, Supervised learning, Multilayer perceptron, Artificial intelligence, Deep learning, Artificial neural network, Machine learning, Catastrophic interference
DocType
Conference
ISBN
0-8186-8629-4
Citations
1
PageRank
0.37
References
7
Authors
4