Title
Towards Verification-Aware Knowledge Distillation for Neural-Network Controlled Systems: Invited Paper
Abstract
Neural networks are widely used in many applications ranging from classification to control. While these networks are composed of simple arithmetic operations, they are challenging to formally verify for properties such as reachability due to the presence of nonlinear activation functions. In this paper, we make the observation that Lipschitz continuity of a neural network not only can play a major role in the construction of reachable sets for neural-network controlled systems but also can be systematically controlled during training of the neural network. We build on this observation to develop a novel verification-aware knowledge distillation framework that transfers the knowledge of a trained network to a new and easier-to-verify network. Experimental results show that our method can substantially improve reachability analysis of neural-network controlled systems for several state-of-the-art tools.
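The abstract builds on the fact that a feedforward network's Lipschitz constant can be bounded and controlled. As illustrative background only (not the paper's actual method), a standard loose upper bound for a ReLU network is the product of the spectral norms of its weight matrices, since ReLU itself is 1-Lipschitz. A minimal sketch, with hypothetical weights:

```python
import numpy as np

def lipschitz_upper_bound(weights):
    """Loose Lipschitz upper bound for a ReLU feedforward network:
    the product of layer weight spectral norms (ReLU is 1-Lipschitz,
    so activations do not increase the bound)."""
    bound = 1.0
    for W in weights:
        bound *= np.linalg.norm(W, 2)  # ord=2 gives the spectral norm
    return bound

# Two-layer toy network with hypothetical weights.
W1 = np.array([[2.0, 0.0], [0.0, 0.5]])   # spectral norm 2.0
W2 = np.array([[0.5, 0.0], [0.0, 0.25]])  # spectral norm 0.5
print(lipschitz_upper_bound([W1, W2]))     # 2.0 * 0.5 = 1.0
```

Penalizing such a bound during distillation is one way a student network could be steered toward smaller Lipschitz constants, which in turn tightens reachable-set over-approximations.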
Year: 2019
DOI: 10.1109/ICCAD45719.2019.8942059
Venue: 2019 IEEE/ACM International Conference on Computer-Aided Design (ICCAD)
Keywords: trained network, easier-to-verify network, verification-aware knowledge distillation framework, neural network controlled systems, reachability analysis, Lipschitz continuity
Field: Nonlinear system, Computer science, Real-time computing, Theoretical computer science, Reachability, Ranging, Distillation, Lipschitz continuity, Artificial neural network
DocType: Conference
ISSN: 1933-7760
ISBN: 978-1-7281-2351-6
Citations: 1
PageRank: 0.35
References: 8
Authors: 5
Authors (in order):
1. Jiameng Fan
2. Chao Huang
3. Wenchao Li
4. Xin Chen
5. Qi Zhu