Title
Modified Gram-Schmidt Algorithm for Extreme Learning Machine
Abstract
Extreme learning machine (ELM) has been shown to be extremely fast while achieving good generalization performance. The basic idea of the ELM algorithm is to choose the parameters of the hidden nodes randomly and then solve for the output weights of the network with a simple generalized inverse operation. Such a procedure faces two problems. First, ELM tends to require more random hidden nodes than conventional tuning-based algorithms. Second, subjectivity is involved in choosing an appropriate number of random hidden nodes. In this paper, we propose an enhanced ELM (en-ELM) algorithm that applies the modified Gram-Schmidt (MGS) method to select hidden nodes from a pool of random hidden nodes. Furthermore, en-ELM uses Akaike's final prediction error (FPE) criterion to determine the number of random hidden nodes automatically. In comparison with the conventional ELM learning method on several commonly used regression benchmark problems, the en-ELM algorithm achieves a compact network with a much faster response and satisfactory accuracy.
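The basic ELM training step described in the abstract (random hidden-node parameters followed by a generalized-inverse solve for the output weights) can be sketched as follows. This is a minimal illustration, not the paper's en-ELM method; the function names and the choice of a sigmoid activation are assumptions for the example.

```python
import numpy as np

def elm_fit(X, y, n_hidden, seed=None):
    """Train a basic ELM: random hidden nodes + pseudo-inverse output solve."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    # Hidden-node parameters are chosen randomly and never tuned.
    W = rng.standard_normal((n_features, n_hidden))  # input weights
    b = rng.standard_normal(n_hidden)                # biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))           # hidden-layer output matrix
    # Output weights via the Moore-Penrose generalized inverse of H.
    beta = np.linalg.pinv(H) @ y
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# Usage: fit a noisy one-dimensional regression problem.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = np.sin(3.0 * X[:, 0]) + 0.05 * rng.standard_normal(200)
W, b, beta = elm_fit(X, y, n_hidden=40, seed=1)
mse = np.mean((elm_predict(X, W, b, beta) - y) ** 2)
```

Because only `beta` is solved for, training reduces to one matrix pseudo-inverse, which is what makes ELM fast; the paper's contribution is then to prune the randomly generated nodes (via MGS selection and the FPE stopping criterion) rather than keep all of them.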
Year: 2009
DOI: 10.1109/ISCID.2009.275
Venue: ISCID (2)
Keywords: random hidden node, appropriate number, modified Gram-Schmidt algorithm, enhanced-ELM algorithm, conventional tuning-based algorithm, hidden node, conventional ELM, extreme learning machine, compact network, ELM algorithm, random hidden nodes pool, inverse problems, regression analysis, feedforward neural network, learning artificial intelligence, subjectivity, feedforward neural networks, data mining, computational intelligence, generalized inverse
Field: Extreme learning machine, Computer science, Generalized inverse, Inverse problem, Artificial intelligence, Gram-Schmidt, Feedforward neural network, Mean squared prediction error, Akaike information criterion, Pattern recognition, Computational intelligence, Algorithm, Machine learning
DocType: Conference
Citations: 1
PageRank: 0.37
References: 5
Authors: 3

| Name | Order | Citations | PageRank |
|---|---|---|---|
| Jian-Chuan Yin | 1 | 76 | 12.14 |
| Fang Dong | 2 | 1 | 0.37 |
| Ni-Ni Wang | 3 | 15 | 4.12 |