Title
Is Extreme Learning Machine Feasible? A Theoretical Assessment (Part I).
Abstract
An extreme learning machine (ELM) is a feedforward neural network (FNN)-like learning system whose connections with output neurons are adjustable, while the connections with and within hidden neurons are randomly fixed. Numerous applications have demonstrated the feasibility and high efficiency of ELM-like systems. It has, however, remained open whether this holds for general applications. In this two-part paper, we conduct a comprehensive feasibility analysis of ELM. In Part I, we answer the question by theoretically justifying the following: 1) for suitable activation functions, such as polynomial, Nadaraya-Watson, and sigmoid functions, ELM-like systems can attain the theoretical generalization bound of FNNs with all connections adjusted, i.e., they do not degrade the generalization capability of FNNs even when the connections with and within hidden neurons are randomly fixed; 2) the number of hidden neurons needed for an ELM-like system to achieve the theoretical bound can be estimated; and 3) whenever the activation function is a polynomial, the resulting hidden layer output matrix has full column rank, so the generalized inverse technique can be applied efficiently to yield the solution of an ELM-like system, while in the nonpolynomial case, Tikhonov regularization can be applied to guarantee weak regularity without sacrificing the generalization capability. In Part II, we reveal a different aspect of the feasibility of ELM: there also exist activation functions for which the corresponding ELM degrades the generalization capability. The obtained results underpin the feasibility and efficiency of ELM-like systems and suggest various generalizations and improvements of the systems as well.
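The training scheme the abstract describes (random, fixed hidden-layer connections plus a linear solve for the output weights via the generalized inverse, or Tikhonov regularization when regularity is a concern) can be sketched in a few lines. The following is a minimal illustrative sketch, not the paper's construction; the sigmoid activation, the function names elm_fit/elm_predict, and the ridge parameter are assumptions for the example.

```python
import numpy as np

def elm_fit(X, y, n_hidden=100, ridge=0.0, seed=0):
    """Fit an ELM-style model: hidden weights are random and fixed,
    only the output weights are learned (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input-to-hidden weights (fixed)
    b = rng.standard_normal(n_hidden)                # random hidden biases (fixed)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))           # hidden layer output matrix (sigmoid)
    if ridge > 0.0:
        # Tikhonov-regularized solve, for when H may be ill-conditioned
        beta = np.linalg.solve(H.T @ H + ridge * np.eye(n_hidden), H.T @ y)
    else:
        # Moore-Penrose generalized inverse, as in the full-column-rank case
        beta = np.linalg.pinv(H) @ y
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# Example usage on synthetic data (hypothetical):
X = np.random.default_rng(1).uniform(-1, 1, (200, 3))
y = np.sin(X).sum(axis=1)
W, b, beta = elm_fit(X, y, n_hidden=50, ridge=1e-6)
```

Because only beta is learned, training reduces to a single linear-algebra solve, which is the source of ELM's efficiency noted in the abstract.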
Year
2015
DOI
10.1109/TNNLS.2014.2335212
Venue
IEEE Trans. Neural Netw. Learning Syst.
Keywords
extreme learning machine (ELM), ELM-like systems, feedforward neural networks (FNN), hidden neurons, activation functions, sigmoid functions, Nadaraya-Watson functions, transfer functions, hidden layer output matrix, full column-rank, generalized inverse technique, Tikhonov regularization, generalization capability, generalisation (artificial intelligence), learning (artificial intelligence), neural networks, feasibility, learning system
DocType
Journal
Volume
26
Issue
1
ISSN
2162-2388
Citations
6
PageRank
0.49
References
0
Authors
4
Name         Order  Citations  PageRank
Xia Liu      1      66         2.18
Shaobo Lin   2      184        20.02
Jian Fang    3      74         3.82
Zongben Xu   4      3203       198.88