Title
Spatial-Temporal Hybrid Neural Network With Computing-in-Memory Architecture
Abstract
Deep learning (DL) has gained unprecedented success in many real-world applications. However, DL poses difficulties for efficient hardware implementation due to its need for complex gradient-based learning algorithms and the high memory bandwidth required for synaptic weight storage, especially in today’s data-intensive environment. Computing-in-memory (CIM) strategies have emerged as an alternat...
Year
2021
DOI
10.1109/TCSI.2021.3071956
Venue
IEEE Transactions on Circuits and Systems I: Regular Papers
Keywords
Reservoirs, Training, Hardware, Feature extraction, Memory management, Biological neural networks, Deep learning
DocType
Journal
Volume
68
Issue
7
ISSN
1549-8328
Citations
1
PageRank
0.35
References
0
Authors
3
Name         Order  Citations  PageRank
Kangjun Bai  1      11         5.28
Lingjia Liu  2      799        92.58
Yi Yang      3      92         9.96