Title |
---|
Deep convolutional extreme learning machines: Filters combination and error model validation. |
Abstract
---
- A disadvantage of convolutional networks is the long training time: adjusting weights with iterative gradient-descent methods can take days, an obstacle for real-time applications.
- Fast convolutional networks avoid gradient-based methods by defining the feature-extraction filters efficiently.
- We propose convolutional extreme learning machines: fast convolutional neural networks based on extreme learning machines and a fixed bank of filters.
- Our model does not employ gradient-descent methods.
- We demonstrate that our model is feasible on inexpensive, non-specialized computer hardware: it performs training tasks faster using a CPU than models based on GPUs.
- Results were generated using the MNIST and EMNIST digits databases. Only two convolutional stages were sufficient to achieve a low error.
- We find that the experimental error agrees with the error postulated by the Rahimi–Recht theorem.
- The proposed network achieved superior accuracy as well as competitive training time, even in relation to approaches that employ GPU processing.
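The gradient-free training the abstract describes rests on the standard extreme learning machine recipe: hidden weights are drawn at random and left fixed, and only the output weights are solved in closed form with a pseudoinverse. A minimal sketch in NumPy (a generic ELM, not the authors' convolutional architecture; the XOR-style toy data, `n_hidden=50`, and the tanh activation are illustrative assumptions):

```python
import numpy as np

def train_elm(X, T, n_hidden, rng):
    """Train a basic extreme learning machine (ELM).

    The hidden layer uses random, fixed weights; only the output
    weights are fit, in closed form, so no gradient descent is used.
    """
    n_features = X.shape[1]
    W = rng.standard_normal((n_features, n_hidden))  # random input weights (never updated)
    b = rng.standard_normal(n_hidden)                # random biases
    H = np.tanh(X @ W + b)                           # hidden-layer activations
    beta = np.linalg.pinv(H) @ T                     # least-squares output weights
    return W, b, beta

def predict_elm(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy usage: fit XOR-like targets in one shot, no iterative training.
rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([[0.], [1.], [1.], [0.]])
W, b, beta = train_elm(X, T, n_hidden=50, rng=rng)
pred = predict_elm(X, W, b, beta)
```

In the paper's setting, `X` would be the feature maps produced by the fixed convolutional filter bank rather than raw inputs; the single pseudoinverse solve is what makes CPU training competitive with GPU-based backpropagation.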
Year | DOI | Venue |
---|---|---|
2019 | 10.1016/j.neucom.2018.10.063 | Neurocomputing |
Keywords | Field | DocType
---|---|---
Convolutional extreme learning machine, Convolutional filter combination, Rahimi–Recht generalization model, Deep learning, Fast machine learning algorithms, Digit recognition | Obstacle, Gradient descent, Pattern recognition, Convolutional neural network, Extreme learning machine, Iterative method, Feature extraction, Artificial intelligence, Backpropagation, Machine learning, Mathematics, Cognitive neuroscience of visual object recognition | Journal
Volume | ISSN | Citations
---|---|---
329 | 0925-2312 | 0
PageRank | References | Authors
---|---|---
0.34 | 37 | 3
Name | Order | Citations | PageRank |
---|---|---|---|
Michel M. dos Santos | 1 | 0 | 0.68 |
Abel G. Filho | 2 | 1 | 1.73 |
Wellington P. dos Santos | 3 | 36 | 11.00 |