| Abstract |
|---|
| An efficient training and pruning method based on the H∞ filtering algorithm is proposed for feedforward neural networks (FNNs). A weight-importance measure that links weight salience to the prediction-error sensitivity obtained from H∞ filtering training is derived, together with a salience-based pruning technique. Extensive experiments indicate that the proposed method prunes the network during training without loss of generalization capacity, and also provides a robust global-optimization training algorithm for arbitrary network structures. |
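The abstract's idea can be illustrated with a minimal sketch: treat the network weights as the state of an H∞ filter, update them per sample, and rank weights by a salience score derived from the filter's uncertainty matrix. This is a simplified linear-in-weights illustration, not the paper's actual derivation; the function names, the choice of γ, and the exact saliency formula are assumptions made for the example.

```python
import numpy as np

def hinf_train(X, y, gamma=10.0, epochs=5):
    """Fit weights w (y ≈ X @ w) with a scalar-output H-infinity filter.

    The weights are the filter state; gamma bounds the worst-case
    disturbance-to-error gain (gamma -> infinity recovers a Kalman/RLS
    update). Returns the weights and the uncertainty matrix P.
    """
    n = X.shape[1]
    w = np.zeros(n)
    P = np.eye(n)              # weight-error uncertainty bound
    theta = 1.0 / gamma**2
    for _ in range(epochs):
        for x, t in zip(X, y):
            H = x[None, :]     # Jacobian of the output w.r.t. the weights
            # modified Riccati factor of the H-infinity recursion
            M = np.eye(n) - theta * P + H.T @ H @ P
            P = P @ np.linalg.inv(M)
            K = P @ H.T        # filter gain
            w = w + (K * (t - x @ w)).ravel()
    return w, P

def saliency(w, P):
    """Illustrative weight-importance measure: a weight matters when it is
    large relative to the filter's remaining uncertainty about it.
    Pruning removes the weights with the smallest scores."""
    return w**2 / (2.0 * np.diag(P))
```

A pruning pass would then zero out the indices returned by `np.argsort(saliency(w, P))` up to some budget and continue training, mirroring the train-then-prune loop the abstract describes.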
| Year | DOI | Venue |
|---|---|---|
| 2006 | 10.1007/11759966_77 | ISNN (1) |
| Keywords | Field | DocType |
|---|---|---|
| arbitrary network structure, feedforward neural networks training, pruning method, pruning technique, better pruning result, efficient training, robust global optimization training, weight importance measure, feedforward neural network, training process | H-infinity methods in control theory, Feedforward neural network, Pattern recognition, Computer science, Filter (signal processing), Probabilistic neural network, Time delay neural network, Artificial intelligence, Machine learning, Pruning | Conference |
| Volume | Issue | ISSN |
|---|---|---|
| 3971 LNCS | null | 16113349 |

| ISBN | Citations | PageRank |
|---|---|---|
| 3-540-34439-X | 0 | 0.34 |

| References | Authors |
|---|---|
| 11 | 3 |
| Name | Order | Citations | PageRank |
|---|---|---|---|
| He-Sheng Tang | 1 | 1 | 1.72 |
| Song-Tao Xue | 2 | 1 | 3.41 |
| Rong Chen | 3 | 0 | 0.34 |