Title
Conditional Generative Models are not Robust.
Abstract
Class-conditional generative models are an increasingly popular approach to achieve robust classification. They are a natural choice for solving discriminative tasks in a robust manner, as they jointly optimize for predictive performance and accurate modeling of the input distribution. In this work, we investigate robust classification with likelihood-based conditional generative models from a theoretical and practical perspective. Our theoretical result reveals that it is impossible to guarantee detectability of adversarial examples even for near-optimal generative classifiers. Experimentally, we show that naively trained conditional generative models have poor discriminative performance, making them unsuitable for classification. This stems from overlooked issues in training conditional generative models, and we present methods to improve performance. Finally, we analyze the robustness of our proposed conditional generative models on MNIST and CIFAR10. While we are able to train robust models for MNIST, robustness completely breaks down on CIFAR10. This lack of robustness is related to various undesirable model properties that maximum likelihood fails to penalize. Our results indicate that likelihood may fundamentally be at odds with robust classification on challenging problems.
Year: 2019
Venue: CoRR
DocType: Journal
Volume: abs/1906.01171
Citations: 0
PageRank: 0.34
References: 0
Authors: 3
Name                   Order  Citations  PageRank
Ethan Fetaya           1      96         12.90
Jörn-Henrik Jacobsen   2      0          0.68
Richard S. Zemel       3      49584      25.68