Title
In-memory area-efficient signal streaming processor design for binary neural networks
Abstract
The expanding use of deep learning algorithms increases the demand for accelerating neural network (NN) signal processing. For NN processing, in-memory computation is desirable because it eliminates expensive data transfers. Reflecting recently proposed binary neural networks (BNNs), which reduce computation resource and area requirements, we designed an in-memory BNN signal processor that densely stores binary weights in on-chip memories and scales linearly with a serial-parallel-serial signal stream. It achieved 3 times better per-power and 71 times better per-area performance than an existing in-memory neuromorphic processor.
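The abstract does not spell out the arithmetic, but BNNs of this kind conventionally replace multiply-accumulate with XNOR and popcount over packed binary weights and activations. The following is a minimal software sketch of that operation, assuming the usual {-1, +1} encoding; the function names and bit packing are illustrative, not taken from the paper.

# Minimal sketch (assumption, not from the paper): the XNOR-popcount
# multiply-accumulate commonly used in BNN accelerators. Weights and
# activations are {-1, +1}, packed one bit each (0 -> -1, 1 -> +1).

def pack_bits(values):
    """Pack a list of +/-1 values into an integer bit vector (1 bit per value)."""
    word = 0
    for i, v in enumerate(values):
        if v == +1:
            word |= 1 << i
    return word

def bnn_dot(w_bits, a_bits, n):
    """Dot product of two packed +/-1 vectors of length n.

    XNOR marks positions where weight and activation agree; the signed
    sum is (#agreements) - (#disagreements) = 2 * popcount(xnor) - n.
    """
    xnor = ~(w_bits ^ a_bits) & ((1 << n) - 1)
    return 2 * bin(xnor).count("1") - n

# Example: w = [+1, -1, +1, +1], a = [+1, +1, -1, +1] -> dot product = 0
w = pack_bits([+1, -1, +1, +1])
a = pack_bits([+1, +1, -1, +1])
print(bnn_dot(w, a, 4))  # prints 0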
Year
2017
DOI
10.1109/MWSCAS.2017.8052874
Venue
2017 IEEE 60th International Midwest Symposium on Circuits and Systems (MWSCAS)
Keywords
binary neural networks, deep learning algorithms, neural network signal processing, in-memory computation, in-memory BNN signal processor, on-chip memories, serial-parallel-serial signal stream, in-memory area-efficient signal streaming processor design
Field
Signal processing, Data transmission, Digital signal processor, Computer science, Neuromorphic engineering, Electronic engineering, Time delay neural network, Artificial intelligence, Deep learning, Computer hardware, Artificial neural network, Parallel computing, Processor design
DocType
Conference
ISBN
978-1-5090-6390-1
Citations
1
PageRank
0.38
References
7
Authors
11
Name / Order / Citations / PageRank
Haruyoshi Yonekawa 1344.37
Shimpei Sato 2122.94
Hiroki Nakahara 315537.34
Kota Ando 4246.81
Kodai Ueyoshi 5223.84
Kazutoshi Hirose 652.94
Kentaro Orimo 7161.57
Shinya Takamaeda-Yamazaki 86516.83
M. Ikebe 94713.63
Tetsuya Asai 1012126.53
Masato Motomura 119127.81