Title
Neural networks for optimal approximation of smooth and analytic functions
Abstract
We prove that neural networks with a single hidden layer are capable of providing an optimal order of approximation for functions assumed to possess a given number of derivatives, if the activation function evaluated by each principal element satisfies certain technical conditions. Under these conditions, it is also possible to construct networks that provide a geometric order of approximation for analytic target functions. The permissible activation functions include the squashing function (1 + e^{-x})^{-1} as well as a variety of radial basis functions. Our proofs are constructive. The weights and thresholds of our networks are chosen independently of the target function; we give explicit formulas for the coefficients as simple, continuous, linear functionals of the target function.
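The construction described in the abstract, with target-independent hidden weights and thresholds and coefficients that are linear functionals of the target, can be illustrated with a minimal sketch. This is not the paper's explicit construction: the uniform choice of weights and thresholds, the least-squares coefficients, and the sample target sin(pi x) are assumptions made here for illustration only.

```python
import numpy as np

def sigmoid(x):
    # The squashing function (1 + e^{-x})^{-1} named in the abstract.
    return 1.0 / (1.0 + np.exp(-x))

# Hidden-layer weights and thresholds fixed in advance, independently of
# the target function (an illustrative choice, not the paper's formulas).
n = 20
weights = np.full(n, 4.0)
thresholds = weights * np.linspace(-1.0, 1.0, n)  # nodes spread over [-1, 1]

# Evaluation grid and hidden-layer responses.
x = np.linspace(-1.0, 1.0, 200)
H = sigmoid(np.outer(x, weights) - thresholds)

# Sample smooth target; the least-squares coefficients depend linearly
# on f, mirroring "coefficients as linear functionals of the target".
f = np.sin(np.pi * x)
coeffs, *_ = np.linalg.lstsq(H, f, rcond=None)
approx = H @ coeffs
print("max error:", np.max(np.abs(approx - f)))
```

With only 20 sigmoidal units the uniform error on this smooth target is already small; the paper's result is that a suitably explicit version of this scheme achieves the optimal approximation order for a given smoothness class.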
Year
1996
DOI
10.1162/neco.1996.8.1.164
Venue
Neural Computation
Keywords
certain technical condition, neural network, explicit formula, analytic function, radial basis function, analytic target function, permissible activation function, squashing function, optimal approximation, target function, activation function, optimal order, geometric order, satisfiability
Field
Universal approximation theorem, Radial basis function network, Mathematical optimization, Radial basis function, Mathematical analysis, Activation function, Analytic function, Non-analytic smooth function, Complex-valued function, Smoothness, Mathematics
DocType
Journal
Volume
8
Issue
1
ISSN
0899-7667
Citations
61
PageRank
4.82
References
10
Authors
1
Name
Hrushikesh Narhar Mhaskar
Order
1
Citations
2576
PageRank
1.07