Title |
---|
15.2 A 28nm 64Kb Inference-Training Two-Way Transpose Multibit 6T SRAM Compute-In-Memory Macro for AI Edge Chips |
Abstract |
---|
Many AI edge devices require local intelligence to achieve fast computing time (t_AC), high energy efficiency (EF), and privacy. The transfer-learning approach is a popular solution for AI edge chips, wherein a network trained in the cloud is fine-tuned (re-trained) in a few of its neural layers on the edge device. This enables the dynamic incorporation of data from in-situ environments or of private information. Computing-in-memory (CIM) is a promising approach to improving EF for AI edge chips. Existing CIM schemes support inference [1]–[5] with forward (FWD) propagation; however, they do not support training, which requires both FWD and backward (BWD) propagation, due to differences in weight-access flow between FWD and BWD propagation. As Fig. 15.2.1 shows, efforts to increase the precision of the input (IN), weight (W), and/or output (OUT) tend to degrade t_AC and EF for training operations irrespective of scheme: digital FWD and BWD (DF-DB) or CIM-FWD-digital-BWD (CiMF-DB). This work develops a two-way transpose (TWT) SRAM-CIM macro supporting multibit MAC operations for FWD and BWD propagation with fast t_AC and high EF within a compact area. The proposed scheme features (1) a TWT multiply cell (TWT-MC) with high resistance to process variation; and (2) a small-offset gain-enhancement sense amplifier (SOGE-SA) to tolerate a small read margin. A 28nm 64Kb TWT SRAM-CIM macro was fabricated using a foundry-provided compact 6T-SRAM cell, supporting both inference and training operations in an SRAM-CIM device for the first time. The macro also demonstrates the fastest t_AC (3.8–21ns) and highest EF (7–61.1TOPS/W) for MAC operations using 2–8b inputs, 4–8b weights, and 12–20b outputs. |
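To illustrate the weight-access asymmetry the abstract refers to, here is a minimal pure-Python sketch (not the authors' circuit; the matrix values are made up): FWD propagation reads the stored weight matrix W row-wise, while BWD propagation reads the same W column-wise (i.e. transposed), which is why an inference-only CIM macro cannot serve training without a transpose-capable cell.

```python
# Hypothetical toy layer: 4 outputs x 3 inputs, small signed weights.
W = [[2, -3, 1],
     [0,  5, -2],
     [4, -1, 3],
     [-2, 2, 0]]

x = [1, 2, 3]                # example low-precision inputs

# FWD propagation: each output MAC reads one ROW of W.
y = [sum(W[i][j] * x[j] for j in range(3)) for i in range(4)]

err = [1, -1, 0, 2]          # example error signal from the layer above

# BWD propagation: each input-gradient MAC reads one COLUMN of W,
# i.e. the transpose of the FWD access pattern.
grad_x = [sum(W[i][j] * err[i] for i in range(4)) for j in range(3)]

print(y)       # row-wise (FWD) result
print(grad_x)  # column-wise (BWD) result
```

A two-way transpose macro serves both access patterns from the same stored cells, avoiding a duplicate (transposed) copy of W or a slow digital BWD path.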
Year | DOI | Venue |
---|---|---|
2020 | 10.1109/ISSCC19947.2020.9062949 | 2020 IEEE International Solid-State Circuits Conference (ISSCC) |
DocType | ISSN | Citations |
---|---|---|
Conference | 0193-6530 | 0 |

PageRank | References | Authors |
---|---|---|
0.34 | 0 | 23 |
Name | Order | Citations | PageRank |
---|---|---|---|
Jian-Wei Su | 1 | 13 | 3.61 |
Xin Si | 2 | 49 | 6.86 |
Yen-Chi Chou | 3 | 8 | 2.20 |
Ting-Wei Chang | 4 | 24 | 5.30 |
Wei-Hsing Huang | 5 | 25 | 2.56 |
Yung-Ning Tu | 6 | 26 | 2.92 |
Ruhui Liu | 7 | 2 | 0.73 |
Pei-Jung Lu | 8 | 7 | 1.81 |
Ta-Wei Liu | 9 | 7 | 2.83 |
Jing-Hong Wang | 10 | 31 | 4.03 |
Zhixiao Zhang | 11 | 8 | 2.87 |
Hongwu Jiang | 12 | 16 | 6.77 |
Shanshi Huang | 13 | 15 | 6.75 |
Chung-Chuan Lo | 14 | 15 | 7.24 |
Ren-Shuo Liu | 15 | 141 | 9.86 |
Chih-Cheng Hsieh | 16 | 218 | 44.84 |
Kea-Tiong Tang | 17 | 109 | 28.91 |
Shyh-Shyuan Sheu | 18 | 211 | 23.45 |
Sih-Han Li | 19 | 7 | 1.80 |
Heng-Yuan Lee | 20 | 228 | 20.66 |
Shih-Chieh Chang | 21 | 641 | 52.31 |
Shimeng Yu | 22 | 490 | 56.22 |
Meng-Fan Chang | 23 | 459 | 45.63 |