Title
Going Deeper in Spiking Neural Networks: VGG and Residual Architectures.
Abstract
Over the past few years, Spiking Neural Networks (SNNs) have become popular as a possible pathway to enable low-power event-driven neuromorphic hardware. However, their application in machine learning has largely been limited to very shallow neural network architectures for simple problems. In this paper, we propose a novel algorithmic technique for generating an SNN with a deep architecture, and demonstrate its effectiveness on complex visual recognition problems such as CIFAR-10 and ImageNet. Our technique applies to both VGG and Residual network architectures, with significantly better accuracy than the state-of-the-art. Finally, we present an analysis of the sparse event-driven computations to demonstrate the reduced hardware overhead of operating in the spiking domain.
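The abstract above refers to generating deep SNNs from trained ANNs and to sparse, event-driven computation. As a rough illustration only, and not the paper's actual algorithm, the Python sketch below shows a rate-coded, non-leaky integrate-and-fire layer whose weights are assumed to be copied from a trained network; the class name IFLayer, the threshold value, the Poisson-style input encoding, and all shapes are hypothetical choices made for this example.

```python
# Illustrative sketch only (not the paper's exact method): a non-leaky
# integrate-and-fire (IF) layer of the kind typically used when converting a
# trained ANN (e.g., VGG) into a rate-coded SNN. All names and values here
# are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)

class IFLayer:
    """Non-leaky integrate-and-fire layer driven by binary input spikes."""
    def __init__(self, weights, threshold):
        self.w = weights                          # (out_units, in_units), copied from the ANN
        self.v_th = threshold                     # firing threshold (hypothetical value)
        self.v_mem = np.zeros(weights.shape[0])   # membrane potentials

    def step(self, in_spikes):
        # Event-driven accumulation: only input units that spiked contribute,
        # so sparser inputs mean fewer accumulate operations.
        active = np.flatnonzero(in_spikes)
        self.v_mem += self.w[:, active].sum(axis=1)
        out_spikes = (self.v_mem >= self.v_th).astype(np.float32)
        self.v_mem[out_spikes == 1] -= self.v_th  # reset by subtraction
        return out_spikes

# Toy usage: Poisson-style binary inputs drive one converted layer for T timesteps;
# the accumulated output spike rates approximate the ANN layer's ReLU activations.
T, n_in, n_out = 100, 64, 10
weights = rng.normal(0, 0.1, size=(n_out, n_in))
rates = rng.uniform(0, 0.5, size=n_in)            # per-timestep input firing probabilities
layer = IFLayer(weights, threshold=1.0)

spike_counts = np.zeros(n_out)
for _ in range(T):
    in_spikes = (rng.random(n_in) < rates).astype(np.float32)
    spike_counts += layer.step(in_spikes)

print("output firing rates:", spike_counts / T)
```

In a scheme of this kind, the accumulation work per timestep scales with the number of input spikes, which is the sparsity property the abstract's hardware-overhead analysis alludes to.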
Year
2018
DOI
10.3389/fnins.2019.00095
Venue
FRONTIERS IN NEUROSCIENCE
Keywords
spiking neural networks, event-driven neural networks, sparsity, neuromorphic computing, visual recognition
Field
Neuromorphic hardware, Residual, Computer architecture, Computer science, Network architecture, Visual recognition, Artificial intelligence, Spiking neural network, Artificial neural network, Machine learning, Computation
DocType
Journal
Volume
13
Citations
29
PageRank
1.02
References
18
Authors
5
Name                Order   Citations   PageRank
Abhronil Sengupta   1       229         23.08
Yuting Ye           2       179         10.18
Robert Y. Wang      3       544         26.88
Chiao Liu           4       29          1.02
Kaushik Roy         5       7093        822.19