Title
Design of adaptive and incremental feed-forward neural networks
Abstract
The concepts of minimizing a weight-sensitivity cost and the training squared error are applied to a biased two-layer perceptron using gradient descent to obtain an adaptive learning mechanism. Experiments show that this mechanism tolerates noisy and inconsistent training instances by localizing the responses of conflicting data. Resampling and dynamic-normalization methods are introduced to construct an incremental feedforward network (IFFN) based on adaptive learning. This incremental learning mechanism has a measurable generalization capability and satisfies almost all of the six criteria proposed for incremental learning.
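The training scheme the abstract describes can be sketched as gradient descent on a two-layer (one-hidden-layer) perceptron with biases, minimizing squared error plus a penalty on the weights. The paper's exact weight-sensitivity cost is not reproduced here; an L2 penalty on the weights stands in as a common proxy, and the names (`lam`, `n_hidden`, `lr`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train(X, y, n_hidden=8, lr=1.0, lam=1e-3, epochs=5000):
    """Gradient descent on squared error + L2 weight penalty
    (a stand-in for the paper's weight-sensitivity cost)."""
    n_in = X.shape[1]
    W1 = rng.normal(0.0, 0.5, (n_in, n_hidden)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(0.0, 0.5, (n_hidden, 1));    b2 = np.zeros(1)
    for _ in range(epochs):
        # forward pass through both sigmoid layers
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        # backpropagate the squared-error gradient through the sigmoids
        d_out = (out - y) * out * (1.0 - out)
        d_h = (d_out @ W2.T) * h * (1.0 - h)
        # descent step; lam * W is the gradient of the L2 penalty term
        W2 -= lr * (h.T @ d_out + lam * W2); b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * (X.T @ d_h + lam * W1);   b1 -= lr * d_h.sum(axis=0)
    return W1, b1, W2, b2

# usage: XOR, a standard test problem for a two-layer perceptron
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
W1, b1, W2, b2 = train(X, y)
pred = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
```

Full-batch updates are used for simplicity; the paper's adaptive mechanism would additionally localize responses to conflicting instances, which this sketch does not attempt.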
Year
1993
DOI
10.1109/ICNN.1993.298604
Venue
San Francisco, CA
Keywords
feedforward neural nets,learning (artificial intelligence),adaptive learning mechanism,biased two-layered perceptron,conflicting data,dynamic normalization,feed-forward neural networks,generalization capability,gradient descent,incremental feedforward network,resampling,training square-error,weight sensitivity cost,neural networks,computer science,adaptive learning,satisfiability,prototypes
Field
Gradient descent,Feedforward neural network,Normalization (statistics),Computer science,Artificial intelligence,Artificial neural network,Adaptive learning,Perceptron,Resampling,Machine learning,Feed forward
DocType
Conference
Citations
1
PageRank
0.43
References
5
Authors
2
Name | Order | Citations | PageRank
Hown-Wen Chen | 1 | 7 | 2.89
Von-Wun Soo | 2 | 416 | 56.84