Title |
---|
Towards Verification-Aware Knowledge Distillation for Neural-Network Controlled Systems: Invited Paper |
Abstract |
---|
Neural networks are widely used in many applications ranging from classification to control. While these networks are composed of simple arithmetic operations, they are challenging to formally verify for properties such as reachability due to the presence of nonlinear activation functions. In this paper, we make the observation that Lipschitz continuity of a neural network not only can play a major role in the construction of reachable sets for neural-network controlled systems but also can be systematically controlled during training of the neural network. We build on this observation to develop a novel verification-aware knowledge distillation framework that transfers the knowledge of a trained network to a new and easier-to-verify network. Experimental results show that our method can substantially improve reachability analysis of neural-network controlled systems for several state-of-the-art tools. |
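The abstract's key observation is that a network's Lipschitz constant governs how much reachable sets can grow under the controller. As a minimal illustration (not the paper's method), the classical upper bound for a feedforward network with 1-Lipschitz activations such as ReLU is the product of the spectral norms of its weight matrices; the function name and network below are hypothetical:

```python
import numpy as np

def lipschitz_upper_bound(weights):
    """Naive Lipschitz upper bound for a feedforward network with
    1-Lipschitz activations (e.g. ReLU): the product of the spectral
    norms (largest singular values) of the layer weight matrices.
    Tighter bounds exist; this is only the classical coarse one."""
    bound = 1.0
    for W in weights:
        bound *= np.linalg.norm(W, 2)  # spectral norm of this layer
    return bound

# Hypothetical two-layer network: 4 -> 8 -> 2
rng = np.random.default_rng(0)
layers = [rng.standard_normal((8, 4)), rng.standard_normal((2, 8))]
L = lipschitz_upper_bound(layers)
```

A smaller bound means tighter over-approximations of the reachable set, which is why training that keeps this quantity small (as the distillation framework does) makes verification easier.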
Year | DOI | Venue |
---|---|---|
2019 | 10.1109/ICCAD45719.2019.8942059 | 2019 IEEE/ACM International Conference on Computer-Aided Design (ICCAD) |
Keywords | Field | DocType
---|---|---|
trained network,easier-to-verify network,verification-aware knowledge distillation framework,neural network controlled systems,reachability analysis,Lipschitz continuity | Nonlinear system,Computer science,Real-time computing,Theoretical computer science,Reachability,Ranging,Distillation,Lipschitz continuity,Artificial neural network | Conference |
ISSN | ISBN | Citations
---|---|---|
1933-7760 | 978-1-7281-2351-6 | 1 |
PageRank | References | Authors
---|---|---|
0.35 | 8 | 5 |
Name | Order | Citations | PageRank |
---|---|---|---|
Jiameng Fan | 1 | 13 | 2.56 |
Chao Huang | 2 | 103 | 30.94 |
Wenchao Li | 3 | 100 | 13.25 |
Xin Chen | 4 | 36 | 9.22 |
Qi Zhu | 5 | 8 | 5.55 |