Title |
---|
Low-Discrepancy Points for Deterministic Assignment of Hidden Weights in Extreme Learning Machines |
Abstract |
---|
The traditional extreme learning machine (ELM) approach assigns the hidden-layer weights at random, while the linear coefficients of the output layer are determined analytically. This brief presents an analysis based on geometric properties of the sampling points used to assign the weight values, investigating the replacement of random generation with low-discrepancy sequences (LDSs). LDSs are a family of sampling methods, commonly employed for numerical integration, that cover multidimensional sets more efficiently than random sequences without requiring any computationally intensive procedure. In particular, we prove that the universal approximation property of the ELM is preserved when LDSs are employed, and we show how an efficient covering improves convergence. Furthermore, since LDSs are generated deterministically, the results are not probabilistic in nature. Simulation results confirm in practice the good theoretical properties obtained by combining the ELM with LDSs. |
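The scheme described in the abstract can be sketched as follows: hidden weights and biases are drawn from a deterministic low-discrepancy sequence instead of a random generator, and only the output layer is fitted, by linear least squares. This is a minimal illustration, not the paper's implementation; the Halton sequence, the tanh activation, the weight scaling, and the names `halton`, `elm_fit`, and `elm_predict` are all assumptions made for the example.

```python
import numpy as np

def halton(n, dim):
    """First n points of the Halton low-discrepancy sequence in [0, 1]^dim."""
    primes = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29][:dim]

    def vdc(i, base):
        # Van der Corput radical inverse of integer i in the given base.
        x, f = 0.0, 1.0 / base
        while i > 0:
            x += f * (i % base)
            i //= base
            f /= base
        return x

    return np.array([[vdc(i, b) for b in primes] for i in range(1, n + 1)])

def elm_fit(X, y, n_hidden, use_lds=True, seed=0):
    """ELM training: fixed hidden weights (LDS or random), least-squares output layer."""
    d = X.shape[1]
    if use_lds:
        pts = halton(n_hidden, d + 1)          # deterministic points in [0,1]^(d+1)
    else:
        pts = np.random.default_rng(seed).random((n_hidden, d + 1))
    W = 8.0 * pts[:, :d] - 4.0                 # hidden weights scaled to [-4, 4] (arbitrary choice)
    b = 8.0 * pts[:, d] - 4.0                  # hidden biases scaled to [-4, 4]
    H = np.tanh(X @ W.T + b)                   # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # analytic output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W.T + b) @ beta
```

With `use_lds=True` the whole model is deterministic: repeated runs give identical weights, which is the practical consequence of replacing random sampling with an LDS.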
Year | DOI | Venue |
---|---|---|
2016 | 10.1109/TNNLS.2015.2424999 | IEEE Trans. Neural Netw. Learning Syst. |
Keywords | Field | DocType
---|---|---
discrepancy, extreme learning machines (ELMs), low-discrepancy sequences (LDSs), universal approximation | Convergence, Extreme learning machine, Computer science, Random assignment, Numerical integration, Sampling (statistics), Artificial intelligence, Probabilistic logic, Approximation property, Machine learning | Journal
Volume | Issue | ISSN
---|---|---
PP | 99 | 2162-2388
Citations | PageRank | References
---|---|---
7 | 0.50 | 13
Authors |
---|
2 |
Name | Order | Citations | PageRank |
---|---|---|---|
Cristiano Cervellera | 1 | 226 | 23.63 |
Danilo Macciò | 2 | 64 | 10.95 |