Title
Compiling Spiking Neural Networks to Mitigate Neuromorphic Hardware Constraints
Abstract
Spiking Neural Networks (SNNs) are efficient computation models for performing spatio-temporal pattern recognition on resource- and power-constrained platforms. Executing SNNs on neuromorphic hardware can further reduce the energy consumption of these platforms. With increasing model size and complexity, mapping SNN-based applications to tile-based neuromorphic hardware is becoming increasingly challenging. This is attributed to the limitation of neuro-synaptic cores, viz. crossbars, which can accommodate only a fixed number of pre-synaptic connections per post-synaptic neuron. For complex SNN-based models with many neurons and many pre-synaptic connections per neuron, (1) connections may need to be pruned after training to fit onto the crossbar resources, leading to a loss in model quality, e.g., accuracy, and (2) the neurons and synapses need to be partitioned and placed on the neuro-synaptic cores of the hardware, which can increase latency and energy consumption. In this work, we propose (1) a novel unrolling technique that decomposes a neuron function with many pre-synaptic connections into a sequence of homogeneous neural units, significantly improving crossbar utilization while retaining all pre-synaptic connections, and (2) SpiNeMap, a novel methodology to map SNNs onto neuromorphic hardware with the aim of minimizing energy consumption and spike latency.
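The fan-in unrolling idea described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation; the function name `unroll_fanin`, the crossbar fan-in limit `K`, and the convention of reserving one input per follow-on unit for a forwarded partial sum are assumptions made here for illustration only.

```python
# Minimal sketch (assumption, not the paper's implementation) of unrolling a
# neuron with a large fan-in into a chain of homogeneous units, each of which
# fits within a K-input crossbar.
def unroll_fanin(presynaptic_weights, K):
    """Split one neuron's pre-synaptic connections into a chain of units.

    The first unit consumes up to K original inputs; each subsequent unit
    reserves one input for the partial result forwarded by the previous unit,
    so it takes at most K - 1 original inputs.
    """
    remaining = list(presynaptic_weights)
    units = [remaining[:K]]          # first unit: up to K original inputs
    remaining = remaining[K:]
    while remaining:                 # later units: up to K - 1 original inputs
        units.append(remaining[:K - 1])
        remaining = remaining[K - 1:]
    return units

# Example: a neuron with 10 pre-synaptic connections mapped to 4-input
# crossbars decomposes into units of sizes [4, 3, 3].
print([len(u) for u in unroll_fanin(range(10), K=4)])
```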
Year
2020
DOI
10.1109/IGSC51522.2020.9290830
Venue
2020 11th International Green and Sustainable Computing Workshops (IGSC)
Keywords
Neuromorphic Computing, Spiking Neural Networks (SNNs), Machine Learning, Computation Graph
DocType
Conference
ISBN
978-1-6654-1553-8
Citations
0
PageRank
0.34
References
20
Authors
2
Name, Order, Citations, PageRank
Adarsha Balaji, 1, 15, 4.27
Anup Das 0001, 2, 367, 33.35