Title
Neural networks for approximation of real functions with the Gaussian functions
Abstract
We present a type of single-hidden-layer feedforward neural network with the Gaussian activation function. First, we give a new, quantitative proof that a single-hidden-layer network with n + 1 hidden neurons can learn n + 1 distinct samples with zero error. We then construct approximate interpolants that interpolate, to arbitrary precision, any set of distinct data in one or several dimensions, and that uniformly approximate any continuous function of one variable. © 2007 IEEE.
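The exact-interpolation claim can be illustrated with a standard Gaussian radial-basis construction: place one Gaussian hidden unit at each of the n + 1 sample points and solve a linear system for the output weights. The sketch below is a minimal illustration under assumed choices (centres at the sample points, a fixed width), not the construction given in the paper.

```python
import numpy as np

def gaussian_rbf_interpolant(x, y, width=0.3):
    """Single-hidden-layer network with Gaussian activations that reproduces
    the n + 1 samples (x_i, y_i) exactly.

    One Gaussian hidden unit is centred at each sample point; the output
    weights solve the (n + 1) x (n + 1) system Phi w = y, which is
    nonsingular for distinct centres. Centre placement and the fixed width
    are illustrative assumptions, not the paper's construction.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # Hidden-layer outputs at the sample points: Phi[i, j] = exp(-((x_i - x_j)/width)^2)
    phi = np.exp(-((x[:, None] - x[None, :]) / width) ** 2)
    w = np.linalg.solve(phi, y)  # output weights giving zero error on the samples
    def network(t):
        t = np.atleast_1d(np.asarray(t, dtype=float))
        return np.exp(-((t[:, None] - x[None, :]) / width) ** 2) @ w
    return network

# Usage: n + 1 = 6 distinct samples of a continuous target function.
xs = np.linspace(0.0, 1.0, 6)
ys = np.sin(2.0 * np.pi * xs)
net = gaussian_rbf_interpolant(xs, ys)
print(np.max(np.abs(net(xs) - ys)))  # ~1e-15, i.e. zero error at the samples
```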
Year
2007
DOI
10.1109/ICNC.2007.498
Venue
ICNC
Keywords
activation function, feedforward neural network, function approximation, neural network, gaussian processes
Field
Universal approximation theorem, Feedforward neural network, Mathematical optimization, Rectifier (neural networks), Activation function, Recurrent neural network, Types of artificial neural networks, Artificial intelligence, Gaussian process, Artificial neural network, Mathematics, Machine learning
DocType
Conference
Volume
1
Issue
null
ISSN
null
ISBN
0-7695-2875-9
Citations
3
PageRank
0.41
References
5
Authors
2
Name, Order, Citations, PageRank
Xuli Han, 1, 159, 22.91
Muzhou Hou, 2, 50, 4.49