Abstract |
---|
Among the computational algorithms available in the literature for supervised learning in feedforward neural networks, a wide range of distinct approaches can be identified. While the adjustment of connection weights is an omnipresent stage, the algorithms differ in three basic aspects: the technique chosen to determine the dimension of the multilayer neural network, the procedure adopted to specify the activation functions, and the kind of composition used to produce the output. Advanced learning algorithms should treat all three aspects simultaneously during learning, and an evolutionary learning algorithm with local search is proposed here. The essence of this approach is a synergy between genetic algorithms and conjugate gradient optimization, operating on a hybrid neural network architecture. As a consequence, the final neural network is generated automatically and is characterized as dedicated and computationally parsimonious. |
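The hybrid scheme summarized in the abstract can be illustrated with a minimal, self-contained sketch (not the authors' implementation): an evolutionary loop mutates the architecture of a single-hidden-layer network (hidden-layer size and activation function), while a local-search operator refines the connection weights of each candidate. Here plain numerical gradient descent stands in for the paper's conjugate gradient optimizer, and all names, rates, and population sizes are illustrative assumptions.

```python
import math
import random

random.seed(0)

# Toy regression task: approximate sin(x) on a small grid.
XS = [i / 10 for i in range(-20, 21)]
YS = [math.sin(x) for x in XS]

# Candidate activation functions the evolutionary search may choose from.
ACTIVATIONS = {"tanh": math.tanh, "relu": lambda v: max(0.0, v)}

def predict(net, x):
    # Single-hidden-layer net: b2 + sum_j w2[j] * act(w1[j]*x + b1[j])
    act = ACTIVATIONS[net["act"]]
    s = net["b2"]
    for w1, b1, w2 in zip(net["w1"], net["b1"], net["w2"]):
        s += w2 * act(w1 * x + b1)
    return s

def mse(net):
    return sum((predict(net, x) - y) ** 2 for x, y in zip(XS, YS)) / len(XS)

def random_net(hidden):
    return {"act": random.choice(list(ACTIVATIONS)),
            "w1": [random.gauss(0, 1) for _ in range(hidden)],
            "b1": [random.gauss(0, 1) for _ in range(hidden)],
            "w2": [random.gauss(0, 1) for _ in range(hidden)],
            "b2": 0.0}

def local_search(net, steps=30, eps=1e-4, lr=0.05):
    # Numerical-gradient descent on the weights: a simple stand-in for
    # the conjugate gradient refinement described in the abstract.
    for _ in range(steps):
        for key in ("w1", "b1", "w2"):
            for j in range(len(net[key])):
                old = net[key][j]
                net[key][j] = old + eps
                up = mse(net)
                net[key][j] = old - eps
                down = mse(net)
                g = (up - down) / (2 * eps)
                step = max(-0.5, min(0.5, lr * g))  # clip for stability
                net[key][j] = old - step
    return net

def mutate(net):
    child = {"act": net["act"], "b2": net["b2"],
             "w1": list(net["w1"]), "b1": list(net["b1"]),
             "w2": list(net["w2"])}
    if random.random() < 0.2:   # mutate the activation function
        child["act"] = random.choice(list(ACTIVATIONS))
    if random.random() < 0.3:   # mutate the architecture: add a hidden unit
        child["w1"].append(random.gauss(0, 1))
        child["b1"].append(random.gauss(0, 1))
        child["w2"].append(random.gauss(0, 1))
    for key in ("w1", "b1", "w2"):  # perturb the weights
        for j in range(len(child[key])):
            child[key][j] += random.gauss(0, 0.1)
    return child

# Evolutionary loop: elitist selection plus mutation over architectures,
# with gradient-based local search refining every candidate's weights.
population = [local_search(random_net(hidden=3)) for _ in range(6)]
init_err = min(mse(n) for n in population)
for gen in range(5):
    population.sort(key=mse)
    parents = population[:3]
    population = parents + [local_search(mutate(random.choice(parents)))
                            for _ in range(3)]

best = min(population, key=mse)
best_err = mse(best)
print("best architecture:", len(best["w1"]), "hidden units,", best["act"])
```

Because the best candidate is carried over unchanged each generation (elitism), the error of the best network never increases across generations; the mutation operator covers both architectural choices (dimension, activation) that the abstract lists alongside weight adjustment.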
Year | Venue | Keywords
---|---|---
2002 | Integrated Computer-Aided Engineering | feedforward neural network, evolutionary learning algorithm, multilayer neural network, computational algorithm, activation function, hybrid neural network architecture, basic aspect, supervised learning, final neural network, advanced learning algorithm, evolutionary approach, local search

Field | DocType | Volume
---|---|---
Feedforward neural network, Computer science, Recurrent neural network, Hybrid neural network, Probabilistic neural network, Types of artificial neural networks, Time delay neural network, Artificial intelligence, Deep learning, Artificial neural network, Machine learning | Journal | 9

Issue | ISSN | Citations
---|---|---
1 | 1069-2509 | 6

PageRank | References | Authors
---|---|---
0.73 | 0 | 2
Name | Order | Citations | PageRank |
---|---|---|---|
Eduardo Masato Iyoda | 1 | 30 | 4.14 |
Fernando J. Von Zuben | 2 | 831 | 81.83 |