Title
In-situ learning in multilayer locally-connected memristive spiking neural network
Abstract
Memristive spiking neural networks (MSNNs) have great potential to process information with higher efficiency and lower latency than conventional artificial neural networks (ANNs). However, MSNNs still lack effective hardware-based training algorithms that achieve performance comparable to mature ANNs. Therefore, a multilayer locally-connected (LC) MSNN is proposed to realize high performance with self-adaptive, in-situ learning. In the LC-MSNN, spatial and temporal interactions are introduced to activate hidden neurons spontaneously; synaptic weights are updated locally with spike-timing-dependent plasticity (STDP) through a pulse scheme comprising processing and updating phases; and the nonlinear conductance response (CR) of the memristors is exploited to realize an adjustive learning rate. The LC-MSNN is comprehensively verified and benchmarked on the MNIST dataset. Moreover, the self-adaptive activation of the hidden neurons is investigated by extracting and visualizing their internal states and related features, and the adjustive learning rate is studied under different nonlinear CRs. The effects of non-idealities, including finite resolution, device-to-device variation, and yield, are also taken into consideration in the LC-MSNN. Simulation results show that the LC-MSNN achieves high performance (a maximum recognition rate of 97.4%) and strong robustness to non-idealities. Therefore, the proposed method is a hardware-friendly algorithm and can be applied to realize high-performance SNNs in a memristor-based hardware system.
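As a rough illustration of the weight-update mechanism summarized above, the sketch below implements a simplified pair-based STDP rule in which each potentiation or depression event moves a memristor conductance along a nonlinear conductance-response curve, so the effective learning rate shrinks as the device approaches its conductance bounds. This is a minimal sketch under assumed parameters; the bounds, STDP amplitudes, time constant, and nonlinearity factor are illustrative and are not the values used in the paper.

```python
import numpy as np

# Illustrative device / learning parameters (assumed, not from the paper)
G_MIN, G_MAX = 1e-6, 1e-4      # conductance bounds (S)
NU = 3.0                       # nonlinearity factor of the conductance response
A_PLUS, A_MINUS = 0.05, 0.05   # STDP amplitudes
TAU = 20.0                     # STDP time constant (ms)

def stdp_delta(dt_ms):
    """Pair-based STDP: pre-before-post (dt > 0) potentiates, otherwise depresses."""
    if dt_ms > 0:
        return A_PLUS * np.exp(-dt_ms / TAU)
    return -A_MINUS * np.exp(dt_ms / TAU)

def update_conductance(g, delta):
    """Apply an STDP-driven step along a nonlinear conductance-response curve.

    The step size decays as the normalized conductance approaches its bound,
    which acts as a self-adjusting (shrinking) learning rate.
    """
    w = (g - G_MIN) / (G_MAX - G_MIN)     # normalized conductance in [0, 1]
    if delta > 0:                         # potentiation: slows near G_MAX
        w += delta * np.exp(-NU * w)
    else:                                 # depression: slows near G_MIN
        w += delta * np.exp(-NU * (1.0 - w))
    return G_MIN + np.clip(w, 0.0, 1.0) * (G_MAX - G_MIN)

# Example: one synapse repeatedly potentiated by a 5 ms pre-post pairing
g = 2e-5
for _ in range(10):
    g = update_conductance(g, stdp_delta(dt_ms=5.0))
    print(f"conductance = {g:.3e} S")
```

With repeated potentiation the printed conductance increases by progressively smaller steps, mirroring how a nonlinear CR yields an adjustive learning rate during in-situ training.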
Year
2021
DOI
10.1016/j.neucom.2021.08.011
Venue
Neurocomputing
Keywords
Memristor, SNNs, Locally-connected, Self-adaptive, In-situ, Adjustive learning rate, STDP
DocType
Journal
Volume
463
ISSN
0925-2312
Citations
0
PageRank
0.34
References
0
Authors
7
Name (Order): Citations, PageRank
Jiwei Li (1): 22.40
Hui Xu (2): 127.67
Shengyang Sun (3): 23.75
Zhiwei Li (4): 1315107.73
Qingjiang Li (5): 64.87
Haijun Liu (6): 24.43
Nan Li (7): 2828.52