Title |
---|
Exploiting Process Variations to Protect Machine Learning Inference Engine from Chip Cloning |
Abstract |
---|
Machine learning inference engines are of great interest for smart edge computing. The compute-in-memory (CIM) architecture has shown significant improvements in throughput and energy efficiency for hardware acceleration, and emerging non-volatile memory (eNVM) technologies offer great potential for instant-on/off operation through dynamic power gating. An inference engine is typically pre-trained in the cloud and then deployed to the field, which raises a new security concern: cloning of the weights stored on an eNVM-based CIM chip. In this paper, we propose a countermeasure to the weight cloning attack that exploits the process variations of the periphery circuitry. In particular, we use weight fine-tuning to compensate for the analog-to-digital converter (ADC) offset of a specific chip instance while inducing a significant accuracy drop on cloned chip instances. We evaluate the proposed scheme on a CIFAR-10 classification task using a VGG-8 network. Our results show that, with a precisely chosen transistor size in the employed SAR-ADC, we can maintain 88%~90% accuracy on the fine-tuned chip, while the same set of weights cloned onto other chips achieves only 20%~40% accuracy on average. The weight fine-tuning can be completed within one epoch of 250 iterations, and on average only 0.02%, 0.025%, and 0.142% of cells are updated per iteration for 2-bit, 4-bit, and 8-bit weight precisions, respectively. |
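The core idea in the abstract — fine-tune weights so inference matches one chip's ADC offsets, so the same weights misbehave on a clone — can be illustrated with a toy sketch. Everything here is an assumption for illustration: `sar_adc` is a simplified ideal quantizer with an input-referred offset (standing in for comparator mismatch in a real SAR-ADC), the offset magnitudes are arbitrary, and a greedy bit-flip search replaces the paper's gradient-based fine-tuning.

```python
import numpy as np

rng = np.random.default_rng(0)

def sar_adc(analog, offset, levels=16, vmax=8.0):
    """Quantize analog partial sums with a chip-specific input-referred
    offset, mimicking SAR-ADC comparator offset from transistor mismatch
    (simplified model, not the paper's circuit)."""
    code = np.round((analog + offset) / vmax * (levels - 1))
    return np.clip(code, 0, levels - 1)

# Two chip instances of the same design, differing only in random ADC
# offsets (hypothetical spread; the paper tunes it via transistor sizing).
offset_a = rng.normal(0.0, 0.5, size=4)  # the chip we fine-tune for
offset_b = rng.normal(0.0, 0.5, size=4)  # a chip an attacker clones onto

x = rng.integers(0, 2, size=(4, 8)).astype(float)  # binary input vectors
w = rng.integers(0, 2, size=(8, 4)).astype(float)  # binary weight matrix

ideal = sar_adc(x @ w, np.zeros(4))  # codes an offset-free chip would give

# Chip-specific "fine-tuning": greedily flip weight bits so chip A's
# offset-corrupted ADC codes match the ideal codes.
w_a = w.copy()
err_init = np.abs(sar_adc(x @ w_a, offset_a) - ideal).sum()
for _ in range(300):
    err = np.abs(sar_adc(x @ w_a, offset_a) - ideal).sum()
    if err == 0:
        break
    i, j = rng.integers(0, 8), rng.integers(0, 4)
    trial = w_a.copy()
    trial[i, j] = 1.0 - trial[i, j]
    if np.abs(sar_adc(x @ trial, offset_a) - ideal).sum() < err:
        w_a = trial  # keep a flip only if it reduces chip A's error

err_a = np.abs(sar_adc(x @ w_a, offset_a) - ideal).sum()  # fine-tuned chip
err_b = np.abs(sar_adc(x @ w_a, offset_b) - ideal).sum()  # cloned chip
print(f"chip A error: {err_a}, cloned chip B error: {err_b}")
```

Because the flipped bits compensate `offset_a` specifically, the fine-tuned weights `w_a` are bound to chip A; on chip B the same weights see uncorrected offsets, which is the mechanism behind the accuracy gap the paper reports.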
Year | DOI | Venue |
---|---|---|
2021 | 10.1109/ISCAS51556.2021.9401659 | 2021 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS) |
Keywords | DocType | ISSN
---|---|---|
Deep neural network, hardware accelerator, in-memory computing, non-volatile memory, hardware security | Conference | 0271-4302
Citations | PageRank | References
---|---|---|
0 | 0.34 | 0
Authors |
---|
5 |
Name | Order | Citations | PageRank |
---|---|---|---|
Shanshi Huang | 1 | 15 | 6.75 |
Xiaochen Peng | 2 | 61 | 12.17 |
Hongwu Jiang | 3 | 16 | 6.77 |
Yandong Luo | 4 | 3 | 2.75 |
Shimeng Yu | 5 | 490 | 56.22 |