Abstract | ||
---|---|---|
Unlike first- and second-generation artificial neural networks, spiking neural networks (SNNs) model the human brain by incorporating not only synaptic state but also a temporal component into their operating model. However, their intrinsic properties make training computationally expensive. This paper presents a novel modification of SpikeProp for SNNs that introduces a smoothing L1∕2 regularization term into the error function. The regularization makes the network structure sparse by driving some weights toward small values so that they can eventually be removed. The convergence of the algorithm is proved under some reasonable conditions. The proposed algorithm has been tested for convergence speed, convergence rate, and generalization on the classical XOR problem, the Iris problem, and Wisconsin Breast Cancer classification. |
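The abstract describes adding a smoothing L1∕2 regularization term to the SpikeProp error function, i.e. penalizing Σ|wᵢ|^(1/2) with a smooth surrogate so the gradient stays well defined at wᵢ = 0. The paper's exact smoothing function is not given here; the sketch below uses the simple surrogate (w² + ε)^(1/4), which is one common way to smooth |w|^(1/2), purely for illustration:

```python
import numpy as np

def smoothed_l_half_penalty(w, eps=1e-4):
    """Smooth surrogate for the L1/2 penalty sum_i |w_i|^(1/2).

    (w^2 + eps)^(1/4) tends to |w|^(1/2) as eps -> 0 and is
    differentiable everywhere, including at w = 0, so it can be
    added to a gradient-trained error function. The choice of
    surrogate and eps here are illustrative assumptions, not the
    paper's exact smoothing function.
    """
    w = np.asarray(w, dtype=float)
    return np.sum((w * w + eps) ** 0.25)

def smoothed_l_half_grad(w, eps=1e-4):
    """Elementwise gradient of the smoothed penalty w.r.t. w."""
    w = np.asarray(w, dtype=float)
    return 0.5 * w * (w * w + eps) ** (-0.75)

# Regularized error, as in the abstract's setup:
#   E(w) = E_data(w) + lam * smoothed_l_half_penalty(w)
# The penalty's gradient grows steep near zero, which is what
# pushes small weights toward zero and yields a sparse network.
```

Because the gradient magnitude increases as a weight approaches zero (up to the smoothing cap), small weights are shrunk aggressively while large weights are barely penalized, which is the sparsity mechanism the abstract refers to.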
Year | DOI | Venue |
---|---|---|
2018 | 10.1016/j.neunet.2018.03.007 | Neural Networks |
Keywords | Field | DocType
---|---|---|
Spiking neural networks, SpikeProp, Smoothing L1∕2 regularization, Convergence, Sparsity | Convergence (routing), Error function, Algorithm, Smoothing, Regularization (mathematics), Rate of convergence, Spiking neural network, Artificial neural network, Mathematics, Computation | Journal
Volume | Issue | ISSN
---|---|---|
103 | 1 | 0893-6080
Citations | PageRank | References
---|---|---|
3 | 0.38 | 10
Authors | |
---|---|---|
4 | |
Name | Order | Citations | PageRank |
---|---|---|---|
Junhong Zhao | 1 | 27 | 7.02 |
Jacek M. Zurada | 2 | 2553 | 226.22 |
Jie Yang | 3 | 66 | 8.63 |
Wei Wu | 4 | 102 | 9.54 |