Title
A system-level perspective to understand the vulnerability of deep learning systems.
Abstract
Deep neural networks (DNNs) are now achieving human-level performance on many machine learning applications such as self-driving cars, gaming, and computer-aided diagnosis. However, recent studies show that this promising technique has gradually become a major attack target, significantly threatening the safety of machine learning services. On one hand, adversarial or poisoning attacks that exploit DNN algorithm vulnerabilities can mislead decisions with very high confidence. On the other hand, system-level DNN attacks, built upon the models, the training/inference algorithms, and the hardware and software involved in DNN execution, have also emerged, causing more diversified damage such as denial of service and private data theft. In this paper, we present an overview of these emerging system-level DNN attacks by systematically formulating their attack routines. Several representative cases are selected in our study to summarize the characteristics of system-level DNN attacks. Based on our formulation, we further discuss the challenges and several possible techniques to mitigate such emerging system-level DNN attacks.
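The adversarial attacks mentioned above, which can mislead a model's decision with very high confidence, are commonly illustrated with a fast-gradient-sign (FGSM)-style input perturbation. The sketch below is a generic, hypothetical PyTorch example (the toy model, random input, and epsilon value are placeholders), not the system-level attack routine formulated in this paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def fgsm_perturb(model: nn.Module, x: torch.Tensor, label: torch.Tensor,
                 epsilon: float = 0.03) -> torch.Tensor:
    """Return a copy of x perturbed in the gradient-sign direction (illustrative only)."""
    x_adv = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x_adv), label)  # loss w.r.t. the chosen label
    loss.backward()                              # gradient of the loss w.r.t. the input
    # Step in the direction that increases the loss, then clip to a valid pixel range.
    return (x_adv + epsilon * x_adv.grad.sign()).clamp(0.0, 1.0).detach()

if __name__ == "__main__":
    # Toy stand-ins (assumed, not from the paper): an untrained linear classifier
    # and a random "image" in place of a real, correctly classified input.
    model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))
    x = torch.rand(1, 1, 28, 28)
    label = model(x).argmax(dim=1)               # the model's current prediction
    x_adv = fgsm_perturb(model, x, label)
    print("clean prediction:", label.item(),
          "| perturbed prediction:", model(x_adv).argmax(dim=1).item())
```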
Year
2019
DOI
10.1145/3287624.3288751
Venue
ASP-DAC
Keywords
DNN, machine learning, mitigation, security, system-level
Field
Denial-of-service attack, Computer science, Inference, Computer security, Real-time computing, Software, Artificial intelligence, Deep learning, Artificial neural network, System level, Vulnerability
DocType
Conference
Citations
1
PageRank
0.37
References
11
Authors
5
Name          Order  Citations  PageRank
Tao Liu       1      45         7.40
Nuo Xu        2      15         5.31
Qi Liu        3      17         3.67
Yanzhi Wang   4      1082       136.11
Wujie Wen     5      300        30.61