Title
Style Neophile: Constantly Seeking Novel Styles for Domain Generalization
Abstract
This paper studies domain generalization via domain-invariant representation learning. Existing methods in this direction suppose that a domain can be characterized by styles of its images, and train a network using style-augmented data so that the network is not biased to particular style distributions. However, these methods are restricted to a finite set of styles since they obtain styles for augmentation from a fixed set of external images or by interpolating those of training data. To address this limitation and maximize the benefit of style augmentation, we propose a new method that synthesizes novel styles constantly during training. Our method manages multiple queues to store styles that have been observed so far, and synthesizes novel styles whose distribution is distinct from the distribution of styles in the queues. The style synthesis process is formulated as a monotone submodular optimization, and thus can be conducted efficiently by a greedy algorithm. Extensive experiments on four public benchmarks demonstrate that the proposed method is capable of achieving state-of-the-art domain generalization performance.
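The abstract describes picking novel styles by greedily maximizing a monotone submodular objective over candidate styles, favoring those far from the styles stored in the queues. The sketch below illustrates that general recipe with a toy facility-location objective (a standard monotone submodular function); the objective, the novelty weighting, and all dimensions are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def greedy_submodular_select(candidates, queue, k):
    """Greedily pick k candidate style vectors.

    Toy facility-location objective (monotone submodular):
        f(S) = sum_c w(c) * max_{s in S} sim(c, s)
    where w(c) is a novelty weight: the distance from candidate c to its
    nearest style in the queue, so candidates unlike previously observed
    styles are favored. Illustrative only, not the paper's formulation.
    """
    # novelty weight: distance of each candidate to its nearest queue style
    d = np.linalg.norm(candidates[:, None, :] - queue[None, :, :], axis=-1)
    w = d.min(axis=1)

    # pairwise candidate similarity (RBF kernel)
    pd = np.linalg.norm(candidates[:, None, :] - candidates[None, :, :], axis=-1)
    sim = np.exp(-pd ** 2)

    selected = []
    cover = np.zeros(len(candidates))  # max_{s in S} sim(c, s) for each c
    for _ in range(k):
        # marginal gain f(S + {i}) - f(S) for every candidate i
        gains = (w[None, :] * np.maximum(sim, cover[None, :])).sum(axis=1) \
                - (w * cover).sum()
        gains[selected] = -np.inf  # never re-pick a selected candidate
        j = int(np.argmax(gains))
        selected.append(j)
        cover = np.maximum(cover, sim[j])
    return selected

# toy usage with hypothetical style dimensions
rng = np.random.default_rng(0)
queue = rng.standard_normal((16, 4))   # styles observed so far
cands = rng.standard_normal((64, 4))   # randomly proposed candidate styles
picked = greedy_submodular_select(cands, queue, k=5)
```

Because the objective is monotone and submodular, the greedy loop enjoys the classic (1 - 1/e) approximation guarantee, which is what makes this kind of style-synthesis step cheap to run at every training iteration.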
Year: 2022
DOI: 10.1109/CVPR52688.2022.00699
Venue: IEEE Conference on Computer Vision and Pattern Recognition
Keywords: Transfer/low-shot/long-tail learning; Recognition: detection, categorization, retrieval
DocType: Conference
Volume: 2022
Issue: 1
Citations: 0
PageRank: 0.34
References: 0
Authors: 4
Name         Order  Citations  PageRank
Juwon Kang   1      0          0.34
Sohyun Lee   2      0          0.34
Namyup Kim   3      0          0.34
Suha Kwak    4      397        20.33