Title
Self-Supervised GANs via Auxiliary Rotation Loss
Abstract
Conditional GANs are at the forefront of natural image synthesis. The main drawback of such models is the necessity for labeled data. In this work we exploit two popular unsupervised learning techniques, adversarial training and self-supervision, and take a step towards bridging the gap between conditional and unconditional GANs. In particular, we allow the networks to collaborate on the task of representation learning, while being adversarial with respect to the classic GAN game. The role of self-supervision is to encourage the discriminator to learn meaningful feature representations which are not forgotten during training. We test empirically both the quality of the learned image representations, and the quality of the synthesized images. Under the same conditions, the self-supervised GAN attains a similar performance to state-of-the-art conditional counterparts. Finally, we show that this approach to fully unsupervised learning can be scaled to attain an FID of 23.4 on unconditional ImageNet generation.
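The abstract describes adding a self-supervised rotation-prediction task to the GAN discriminator. The sketch below illustrates that idea in PyTorch: images are rotated by 0/90/180/270 degrees, the discriminator has a rotation-classification head alongside its GAN head, and a cross-entropy rotation loss is added to both players' objectives. The network interface (D returning a pair of outputs), the non-saturating GAN loss, and the weights alpha and beta are illustrative assumptions, not the authors' exact implementation.

```python
# Minimal sketch of a GAN training step with an auxiliary rotation loss.
# Assumes D(x) returns (gan_logit, rotation_logits); alpha/beta are
# illustrative loss weights, not the paper's tuned values.
import torch
import torch.nn.functional as F

def rotate_batch(x):
    """Return the batch rotated by 0/90/180/270 degrees plus rotation labels."""
    rots = [torch.rot90(x, k, dims=(2, 3)) for k in range(4)]
    labels = torch.arange(4, device=x.device).repeat_interleave(x.size(0))
    return torch.cat(rots, dim=0), labels

def discriminator_loss(D, G, real, z, beta=1.0):
    fake = G(z).detach()
    d_real, _ = D(real)
    d_fake, _ = D(fake)
    gan_loss = F.softplus(-d_real).mean() + F.softplus(d_fake).mean()
    # Self-supervision: classify the rotation applied to real images.
    rot_real, rot_labels = rotate_batch(real)
    _, rot_logits = D(rot_real)
    rot_loss = F.cross_entropy(rot_logits, rot_labels)
    return gan_loss + beta * rot_loss

def generator_loss(D, G, z, alpha=0.2):
    fake = G(z)
    d_fake, _ = D(fake)
    gan_loss = F.softplus(-d_fake).mean()
    # Reward generated images whose rotations the discriminator can recognize.
    rot_fake, rot_labels = rotate_batch(fake)
    _, rot_logits = D(rot_fake)
    rot_loss = F.cross_entropy(rot_logits, rot_labels)
    return gan_loss + alpha * rot_loss
```

The rotation head gives the discriminator a stable representation-learning signal that does not depend on the (non-stationary) generator distribution, which is the mechanism the abstract credits for preventing forgetting.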
Year
2019
DOI
10.1109/CVPR.2019.01243
Venue
2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2019)
Field
Unsupervised learning, Feature learning, Machine learning, Artificial intelligence, Adversarial system, Discriminator, Labeled data, Mathematics
DocType
Conference
ISSN
1063-6919
Citations
7
PageRank
0.41
References
0
Authors
5
Name            Order  Citations  PageRank
Ting Chen       1      138        13.81
Xiaohua Zhai    2      209        13.00
Marvin Ritter   3      14         2.52
Mario Lucic     4      231        16.10
Neil Houlsby    5      153        14.73