Title
An incremental learning method for neural networks based on sensitivity analysis
Abstract
The Sensitivity-Based Linear Learning Method (SBLLM) is a learning method for two-layer feedforward neural networks, based on sensitivity analysis, that calculates the weights by solving a system of linear equations. This yields a considerable saving in computational time, which makes the method compare favorably with other batch learning algorithms. The SBLLM works in batch mode; however, several reasons justify the need for an on-line version of the algorithm, among them the need for real-time learning in environments where the information is not available at the outset but is continually acquired, or in situations where large databases must be handled with limited computing resources. In this paper an incremental version of the SBLLM is presented. The theoretical basis of the method is given, and its performance is illustrated by comparing the results obtained with the on-line and batch versions of the algorithm.
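The abstract's central idea is that the weights of a layer can be obtained by solving a linear system, and that such a system can be updated incrementally as samples arrive. The following is a minimal sketch of that general idea only, under the assumption of a single linear layer fitted by accumulated normal equations; it is not the authors' SBLLM, which additionally uses sensitivity analysis to couple the two layers of the network. All variable names (X, Y, A, B, W_hat) are illustrative choices.

```python
import numpy as np

# Minimal sketch (assumption: one linear layer, least-squares fit).
# Samples are processed one at a time by rank-one updates of the
# normal-equation accumulators, so the weights can be re-solved at any
# point without revisiting old data -- the flavor of incremental learning
# described in the abstract, not the SBLLM itself.

rng = np.random.default_rng(0)

d_in, d_out, n = 5, 2, 200
X = rng.normal(size=(n, d_in))
W_true = rng.normal(size=(d_in, d_out))
Y = X @ W_true + 0.01 * rng.normal(size=(n, d_out))

# Accumulators for the normal equations A W = B, where
# A = sum_i x_i x_i^T and B = sum_i x_i y_i^T.
A = np.zeros((d_in, d_in))
B = np.zeros((d_in, d_out))

for x, y in zip(X, Y):         # samples arrive one at a time
    A += np.outer(x, x)        # rank-one update of A
    B += np.outer(x, y)        # corresponding update of B

W_hat = np.linalg.solve(A, B)  # solve the linear system for the weights
print(np.allclose(W_hat, W_true, atol=0.05))
```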
Year
2009
DOI
10.1007/978-3-642-14264-2_5
Venue
CAEPIA
Keywords
neural network, batch mode, computational time, real time, important saving, batch mode version, computing resource, large databases, sensitivity analysis, sensitivity-based linear learning method, incremental version, on-line version, incremental learning method, feedforward neural network, linear system of equations
Field
Competitive learning, Online machine learning, Feedforward neural network, System of linear equations, Computer science, Wake-sleep algorithm, Batch processing, Artificial intelligence, Artificial neural network, Population-based incremental learning, Machine learning
DocType
Conference
Volume
5988
ISSN
0302-9743
ISBN
3-642-14263-X
Citations
4
PageRank
0.50
References
7
Authors
3
Name, Order, Citations, PageRank
Beatriz Pérez-Sánchez, 1, 95, 14.03
Oscar Fontenla-Romero, 2, 337, 39.49
Bertha Guijarro-Berdiñas, 3, 296, 34.36