Title
A fast and parsimonious fuzzy neural network (FPFNN) for function approximation
Abstract
In this paper, a novel online self-constructing approach, named the fast and parsimonious fuzzy neural network (FPFNN), which merges the pruning strategy into the growing criteria, is proposed for function approximation. Because the growth criterion combines the characteristics of growing and pruning, the restrained growth not only speeds up the online learning process but also builds a more parsimonious fuzzy neural network while comparable performance and accuracy are obtained. The FPFNN starts with no hidden neurons and parsimoniously generates new hidden units according to the restrictive growing criteria as learning proceeds. In the second learning phase, the free parameters of the hidden units, whether newly created or originally existing, are updated by the extended Kalman filter (EKF) method. The performance of the FPFNN algorithm is compared with that of typical algorithms such as RANEKF, MRAN, DFNN, and GDFNN in function approximation. The simulation results demonstrate that the proposed FPFNN algorithm provides faster learning and a more compact network structure with comparable generalization performance and accuracy.
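The abstract describes a two-phase online procedure: a restrictive growing criterion decides whether a new hidden unit is recruited for the current sample, and an EKF step then refines all free parameters (weights, centers, widths) of both new and existing units. The following is a minimal, illustrative Python sketch of that kind of self-constructing Gaussian/fuzzy approximator; it is not the authors' implementation, and the class name, thresholds, and initialization constants are assumptions made for illustration only.

```python
# Illustrative sketch (not the paper's code) of an online self-constructing
# Gaussian/fuzzy approximator: grow a unit only when a restrictive criterion
# fires, then refine all free parameters with an extended Kalman filter step.
import numpy as np

class GrowingFuzzyNet:
    def __init__(self, e_thresh=0.1, d_thresh=0.5, p0=1.0, q=1e-4, r=1e-2):
        self.e_thresh = e_thresh        # error threshold for growth (assumed)
        self.d_thresh = d_thresh        # distance threshold for growth (assumed)
        self.p0, self.q, self.r = p0, q, r
        self.centers, self.widths, self.weights = [], [], []
        self.P = np.zeros((0, 0))       # EKF covariance over all parameters

    def _phi(self, x):
        # Gaussian membership/basis values of every hidden unit
        return np.array([np.exp(-np.sum((x - c) ** 2) / (2 * s ** 2))
                         for c, s in zip(self.centers, self.widths)])

    def predict(self, x):
        return float(np.dot(self.weights, self._phi(x))) if self.centers else 0.0

    def _grow(self, x, e):
        # Restrictive growing criterion: large error AND far from all centers
        d_min = (min(np.linalg.norm(x - c) for c in self.centers)
                 if self.centers else np.inf)
        if abs(e) > self.e_thresh and d_min > self.d_thresh:
            self.centers.append(np.array(x, dtype=float))
            self.widths.append(float(max(min(d_min, 10.0), self.d_thresh)))
            self.weights.append(float(e))       # new weight cancels current error
            n_old = self.P.shape[0]
            k = 1 + len(x) + 1                  # weight + center dims + width
            P_new = np.eye(n_old + k) * self.p0
            P_new[:n_old, :n_old] = self.P
            self.P = P_new

    def _ekf_update(self, x, e):
        # Jacobian of the output w.r.t. every free parameter (w_i, c_i, s_i)
        phi = self._phi(x)
        H = []
        for i, (c, s, w) in enumerate(zip(self.centers, self.widths, self.weights)):
            H.append(phi[i])                                       # dy/dw_i
            H.extend(w * phi[i] * (x - c) / s ** 2)                # dy/dc_i
            H.append(w * phi[i] * np.sum((x - c) ** 2) / s ** 3)   # dy/ds_i
        H = np.array(H).reshape(-1, 1)
        S = (H.T @ self.P @ H).item() + self.r
        K = (self.P @ H) / S                                       # Kalman gain
        theta = np.concatenate([np.r_[w, c, s] for w, c, s in
                                zip(self.weights, self.centers, self.widths)])
        theta = theta + K.ravel() * e
        self.P = self.P - K @ H.T @ self.P + self.q * np.eye(self.P.shape[0])
        dim = len(x)
        for i in range(len(self.centers)):                         # write back
            base = i * (dim + 2)
            self.weights[i] = float(theta[base])
            self.centers[i] = theta[base + 1: base + 1 + dim].copy()
            self.widths[i] = float(max(theta[base + 1 + dim], 1e-3))

    def learn(self, x, y):
        x = np.asarray(x, dtype=float)
        e = y - self.predict(x)
        self._grow(x, e)                 # phase 1: restricted structure growth
        if self.centers:
            e = y - self.predict(x)      # recompute after possible growth
            self._ekf_update(x, e)       # phase 2: EKF refines new and old units
```

A brief usage note under the same assumptions: calling `net.learn(x, y)` once per incoming sample grows the structure only when both the approximation error and the distance to existing centers are large, so the criterion acts as an implicit pruning of unnecessary growth, which is the compactness argument the abstract makes; all remaining parameters are refined jointly by the EKF step.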
Year
2009
DOI
10.1109/CDC.2009.5400146
Venue
CDC
Keywords
self-adjusting systems, kalman filters, pruning strategy, online self-constructing, online learning, function approximation, fpfnn algorithm, extended kalman filter, fast and parsimonious fuzzy neural network, growth criterion, fuzzy neural nets, nonlinear filters, approximation algorithms, data mining, fuzzy neural network
Field
Approximation algorithm, Extended Kalman filter, Function approximation, Computer science, Kalman filter, Artificial intelligence, Artificial neural network, Speedup, Free parameter, Network structure
DocType
Conference
ISSN
0191-2216
ISBN
978-1-4244-3872-3
Citations
1
PageRank
0.36
References
15
Authors
3
Name            Order   Citations   PageRank
Ning Wang       1       202         18.93
Xianyao Meng    2       114         4.30
Qingyang Xu     3       15          2.85