Title
Energy efficient parallel neuromorphic architectures with approximate arithmetic on FPGA
Abstract
In this paper, we present parallel neuromorphic processor architectures for spiking neural networks on FPGA. The proposed architectures address several critical issues in efficient parallelization: updating membrane potentials, storing synaptic weights on-chip, and integrating approximate arithmetic units. The trade-offs between throughput, hardware cost, and power overhead for different configurations are thoroughly investigated. Notably, for the application of handwritten digit recognition, a promising training speedup of 13.5x and a recognition speedup of 25.8x are achieved by a parallel implementation with a degree of parallelism of 32. Despite its 120 MHz operating frequency, the 32-way parallel hardware design demonstrates a 59.4x training speedup over a single-threaded software program running on a 2.2 GHz general-purpose CPU. Equally importantly, by leveraging the built-in resilience of the neuromorphic architecture, we demonstrate the energy benefit resulting from approximate arithmetic computation. Up to 20% improvement in energy consumption is achieved by integrating approximate multipliers into the system while maintaining nearly the same recognition rate as with standard multipliers. To the best of our knowledge, this is the first time that approximate computing and parallel processing have been applied to FPGA-based spiking neural networks. The influence of parallel processing on the benefits of approximate computing is also discussed in detail.
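The abstract's two core ingredients, a fixed-point membrane-potential update and an approximate multiplier, can be sketched in software as follows. This is a minimal illustration only: the truncation-based multiplier, the bit widths, the leak factor, and all function names are assumptions for exposition, not the paper's actual hardware design.

```python
# Sketch: leaky integrate-and-fire (LIF) membrane update driven by a
# truncation-based approximate multiplier. All parameters below are
# illustrative assumptions, not taken from the paper.

def approx_mult(a: int, b: int, k: int = 4) -> int:
    """Multiply after zeroing the k least-significant bits of each operand.

    Dropping low-order partial products is a common way approximate
    multipliers trade a small result error for lower energy per operation.
    """
    mask = ~((1 << k) - 1)
    return (a & mask) * (b & mask)

def lif_update(v: int, weighted_input: int,
               threshold: int = 1 << 12) -> tuple[int, bool]:
    """One fixed-point membrane-potential update; returns (new_v, spiked).

    Leak is modeled as v *= 15/16, with the multiply-by-15 done by the
    approximate multiplier and the divide-by-16 as a right shift.
    """
    v = approx_mult(v, 15, k=2) >> 4   # leaky decay
    v += weighted_input                # integrate incoming synaptic current
    if v >= threshold:
        return 0, True                 # fire and reset
    return v, False

# Approximate vs. exact product: 255*255 = 65025, truncated version is lower.
print(approx_mult(255, 255, k=4))      # 240 * 240 = 57600
print(lif_update(5000, 400))           # crosses threshold -> (0, True)
```

In a parallel implementation, many independent `lif_update` evaluations per time step are the natural unit of parallelism, which is why the membrane-potential update and the multiplier design dominate the throughput/energy trade-off discussed in the abstract.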
Year
2017
DOI
10.1016/j.neucom.2016.09.071
Venue
Neurocomputing
Keywords
Spiking neural network, Neuromorphic system, Approximate computing, Parallel architecture
Field
Computer science, Efficient energy use, Degree of parallelism, Neuromorphic engineering, Field-programmable gate array, Arithmetic, Artificial intelligence, Throughput, Spiking neural network, Energy consumption, Machine learning, Speedup
DocType
Journal
Volume
221
Issue
C
ISSN
0925-2312
Citations
13
PageRank
0.58
References
20
Authors
5
Name            Order  Citations  PageRank
Qian Wang       1      39         2.68
Youjie Li       2      40         3.26
Botang Shao     3      23         1.61
Siddhartha Dey  4      13         0.58
Peng Li         5      1912       152.85