Abstract |
---|
For three kinds of neural networks constructed by Suzuki [Constructive function approximation by three layer artificial neural networks, Neural Networks 11 (1998) 1049-1058], we establish both upper- and lower-bound estimates on the approximation order, determine the essential approximation order of these networks, and prove a saturation theorem (the largest attainable approximation capacity). These results precisely characterize the approximation ability of the networks and clarify the relationship among the rate of approximation, the number of hidden-layer units, and the properties of the approximated functions. Our paper extends and refines the error estimates of Suzuki (1998). A numerical example shows that our estimates are more accurate than Suzuki's. |
Year | DOI | Venue |
---|---|---|
2008 | 10.1016/j.neucom.2007.11.004 | Neurocomputing |
Keywords | Field | DocType
---|---|---|
function approximation, upper and lower bounds, artificial neural network, neural network | Universal approximation theorem, Function approximation, Upper and lower bounds, Constructive, Minimax approximation algorithm, Modulus of smoothness, Artificial intelligence, Artificial neural network, Mathematics, Machine learning, Approximation error | Journal
Volume | Issue | ISSN
---|---|---|
71 | 16-18 | 0925-2312
Citations | PageRank | References
---|---|---|
1 | 0.35 | 7
Authors |
---|
3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Fengjun Li | 1 | 233 | 23.55 |
Zongben Xu | 2 | 3203 | 198.88 |
Yue-Ting Zhou | 3 | 1 | 0.35 |