Title: Diverse Knowledge Distillation (DKD): A Solution for Improving the Robustness of Ensemble Models Against Adversarial Attacks
Abstract: This paper proposes an ensemble learning model that is resistant to adversarial attacks. To build resilience, we introduce a training process in which each member learns a radically distinct latent space. Member models are added to the ensemble one at a time; simultaneously, the loss function is regularized by a reverse knowledge distillation term, forcing the new member to learn different features and map...
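The training idea sketched in the abstract can be illustrated with a minimal loss function: the new member's task loss is augmented with a penalty for agreeing with the latent representations of previously trained members. This is only an illustrative sketch, not the authors' implementation; the cosine-similarity penalty, the weight `alpha`, and the function names are assumptions.

```python
import math

def cosine(u, v):
    # Cosine similarity between two latent vectors (plain Python lists).
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv + 1e-12)

def dkd_loss(new_latent, prev_latents, task_loss, alpha=0.5):
    """Task loss plus a reverse-distillation-style diversity penalty.

    The penalty grows when the new member's latent vector resembles those
    of earlier ensemble members, pushing it toward a distinct latent space.
    `alpha` (the penalty weight) is a hypothetical hyperparameter.
    """
    if not prev_latents:  # first member: no earlier members to differ from
        return task_loss
    sim = sum(cosine(new_latent, z) for z in prev_latents) / len(prev_latents)
    return task_loss + alpha * sim

# Usage: a latent aligned with an earlier member is penalized more than
# an orthogonal one, at the same task loss.
prev = [[1.0, 0.0]]
aligned = dkd_loss([1.0, 0.0], prev, task_loss=0.2)      # ~0.2 + 0.5 * 1.0
orthogonal = dkd_loss([0.0, 1.0], prev, task_loss=0.2)   # ~0.2 + 0.5 * 0.0
```

Under this sketch, gradient descent on `dkd_loss` would trade task accuracy against dissimilarity from the existing ensemble, which is one way to realize the "radically distinct latent space" objective described above.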
Year: 2021
DOI: 10.1109/ISQED51717.2021.9424353
Venue: 2021 22nd International Symposium on Quality Electronic Design (ISQED)
Keywords: Resistance, Training, Resists, Feature extraction, Extraterrestrial measurements, Robustness, Security
DocType: Conference
ISSN: 1948-3287
ISBN: 978-1-7281-7641-3
Citations: 1
PageRank: 0.38
References: 0
Authors: 5
Name               Order   Citations   PageRank
Ali Mirzaeian      1       2           1.07
Jana Kosecká       2       1523        129.85
Houman Homayoun    3       579         69.64
Tinoosh Mohsenin   4       1           0.38
Avesta Sasan       5       228         28.57