Title |
---|
A study on the randomness reduction effect of extreme learning machine with ridge regression |
Abstract |
---|
In recent years, the Extreme Learning Machine (ELM) has attracted comprehensive attention as a universal function approximator. Compared to other single-layer feedforward neural networks, the input parameters of its hidden neurons can be randomly generated rather than tuned, thereby saving a huge amount of computational power. However, it has been pointed out that the randomness of ELM parameters can result in fluctuating performance. In this paper, we intensively investigate the randomness reduction effect of a regularized version of ELM, named Ridge ELM (RELM). RELM has previously been shown to achieve generally better generalization than the original ELM. Furthermore, we demonstrate that RELM can also greatly reduce the performance fluctuation on 12 real-world regression tasks. An insight into this randomness reduction effect is also given. |
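The abstract describes the RELM idea: the hidden-layer weights are drawn at random and left untuned, and only the output weights are solved for, with a ridge (L2) penalty in place of the plain least-squares solution of the original ELM. A minimal NumPy sketch of this scheme follows; the function names, the tanh activation, and the regularization value are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def relm_fit(X, y, n_hidden=50, lam=1e-3, seed=None):
    """Fit a Ridge ELM: random hidden layer + ridge-regularized output weights.
    Names and defaults are illustrative, not from the paper."""
    rng = np.random.default_rng(seed)
    # Input weights and biases are generated at random and never tuned.
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)  # hidden-layer output matrix
    # Ridge regression for the output weights:
    #   beta = (H^T H + lam * I)^{-1} H^T y
    beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ y)
    return W, b, beta

def relm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy regression task: approximate y = sin(x).
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(X).ravel()
W, b, beta = relm_fit(X, y, n_hidden=50, lam=1e-3, seed=0)
pred = relm_predict(X, W, b, beta)
mse = np.mean((pred - y) ** 2)
```

Setting `lam` to zero recovers the (pseudo-inverse) solution of the original ELM; the nonzero penalty shrinks the output weights, which is the mechanism the paper associates with both better generalization and reduced sensitivity to the random hidden parameters.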
Year | DOI | Venue |
---|---|---|
2013 | 10.1007/978-3-642-39065-4_21 | ISNN (1) |
Keywords | Field | DocType |
---|---|---|
better generalization, randomness reduction effect, ridge elm, hidden neuron, elm parameter, extreme learning machine, original elm, computational power, ridge regression, fluctuating performance, comprehensive attention | Feedforward neural network, Regression, Computer science, Extreme learning machine, Ridge, Artificial intelligence, Machine learning, Randomness | Conference |
Citations | PageRank | References |
---|---|---|
3 | 0.38 | 15 |
Authors |
---|
3 |
Name | Order | Citations | PageRank |
---|---|---|---|
J. Meng | 1 | 2793 | 174.51 |
Zhifei Shao | 2 | 62 | 4.97 |
Ning Wang | 3 | 333 | 18.88 |