Title
An oscillation bound of the generalization performance of extreme learning machine and corresponding analysis.
Abstract
Extreme Learning Machine (ELM), first proposed by Huang et al. in 2004, performs better than traditional learning machines such as BP networks and SVMs in some applications. This paper attempts to give an oscillation bound on the generalization performance of ELM and a reason why ELM is not sensitive to the number of hidden nodes, both essential open problems posed by Huang et al. in 2011. The bound is derived within the framework of statistical learning theory, under the assumption that the expectation of the ELM kernel exists. It turns out that our bound is consistent with previously reported experimental results on ELM and predicts that overfitting can be avoided even when the number of hidden nodes approaches infinity. This prediction is confirmed by our experiments on 15 data sets using an activation function whose parameters are each drawn independently from the same Gaussian distribution, which satisfies the assumption above. The experiments also show that, as the number of hidden nodes approaches infinity, the ELM kernel induced by this activation is insensitive to the kernel parameter.
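The training scheme the abstract refers to can be illustrated with a minimal sketch of a basic ELM regressor: hidden-node parameters are drawn i.i.d. from a standard Gaussian (matching the randomly drawn activation parameters described above; the specific activation and kernel construction used in the paper may differ), and only the output weights are fitted, by least squares.

```python
import numpy as np

def elm_fit(X, y, n_hidden=200, rng=None):
    """Train a basic ELM: random hidden layer + least-squares output weights.

    Input weights and biases are drawn i.i.d. from N(0, 1); this is an
    illustrative choice, not necessarily the paper's exact setup.
    """
    rng = np.random.default_rng(rng)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights
    b = rng.standard_normal(n_hidden)                # random biases
    H = np.tanh(X @ W + b)                           # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ y                     # Moore-Penrose least squares
    return W, b, beta

def elm_predict(X, W, b, beta):
    """Apply the fixed random hidden layer, then the learned output weights."""
    return np.tanh(X @ W + b) @ beta
```

Because the hidden layer is never trained, fitting reduces to one linear least-squares solve; the paper's question is how the generalization error of such a machine behaves as `n_hidden` grows without bound.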
Year: 2015
DOI: 10.1016/j.neucom.2014.10.006
Venue: Neurocomputing
Keywords: Extreme learning machine, Oscillation bound, Generalization performance, Theoretical research, Infinite hidden nodes
Field: Kernel (linear algebra), Statistical learning theory, Oscillation, Activation function, Extreme learning machine, Support vector machine, Infinity, Algorithm, Artificial intelligence, Overfitting, Mathematics, Machine learning
DocType: Journal
Volume: 151
ISSN: 0925-2312
Citations: 6
PageRank: 0.41
References: 34
Authors: 3
Authors (Name / Order / Citations / PageRank):
Di Wang / 1 / 1337143.48
Ping Wang / 2 / 1012.69
Yan Ji / 3 / 6 / 0.41