Title
Unsupervised Continual Learning for Gradually Varying Domains
Abstract
In Unsupervised Domain Adaptation (UDA), a network trained on a source domain is adapted to a target domain where no labeled data is available. Existing UDA techniques assume that the entire target domain is available at once, which may not hold during deployment in realistic settings where batches of target data are acquired over time. Continual Learning (CL) addresses such data-constrained settings in a supervised manner: batches of labeled samples are presented to the network sequentially, and the network continually learns from the new data without forgetting what it learned previously. Our method for unsupervised continual learning serves as a bridge between the UDA and CL paradigms. This research addresses a gradually evolving target domain fragmented into multiple sequential batches, where the model continually adapts to the gradually varying data stream in an unsupervised manner. To tackle this challenge, we propose a source-free method based on episodic memory replay with buffer management. A contrastive loss is incorporated for better alignment between the buffer samples and the continual stream of batches. Our experiments on the rotating MNIST and CORe50 datasets confirm the benefits of our unsupervised continual learning method for gradually varying domains. The code is available at https://github.com/abutaufique/ucl-gv.git.
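The abstract outlines the method only at a high level: a replay buffer of past target samples (episodic memory with buffer management) plus a contrastive loss aligning buffer samples with the incoming stream. The sketch below is a minimal, hypothetical PyTorch illustration of that idea under stated assumptions: reservoir sampling as the buffer-management policy and an InfoNCE-style nearest-neighbor loss as the contrastive term. All names (ReplayBuffer, contrastive_alignment, temperature) are invented for illustration; the authors' actual implementation is in the linked repository.

import random
import torch
import torch.nn.functional as F

class ReplayBuffer:
    """Fixed-size episodic memory with reservoir-style buffer management
    (an assumed policy; the paper's exact strategy may differ)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.samples = []   # stored target-domain samples
        self.seen = 0       # total samples observed so far

    def add(self, batch):
        # Reservoir sampling keeps a uniform subsample of the stream.
        for x in batch:
            self.seen += 1
            if len(self.samples) < self.capacity:
                self.samples.append(x)
            else:
                j = random.randrange(self.seen)
                if j < self.capacity:
                    self.samples[j] = x

    def sample(self, n):
        idx = random.sample(range(len(self.samples)), min(n, len(self.samples)))
        return torch.stack([self.samples[i] for i in idx])

def contrastive_alignment(feats_new, feats_buf, temperature=0.1):
    # InfoNCE-style loss: pull each new-batch feature toward its nearest
    # buffer feature and push it away from the rest. This is one plausible
    # instantiation of "aligning buffer samples with the continual stream",
    # not necessarily the paper's exact loss.
    z_new = F.normalize(feats_new, dim=1)
    z_buf = F.normalize(feats_buf, dim=1)
    logits = z_new @ z_buf.t() / temperature    # (B_new, B_buf) similarities
    targets = logits.argmax(dim=1)              # nearest neighbor as pseudo-positive
    return F.cross_entropy(logits, targets)

# Usage sketch with a stand-in encoder and random data.
encoder = torch.nn.Linear(784, 128)     # placeholder for the adapted network
buffer = ReplayBuffer(capacity=256)
stream_batch = torch.randn(32, 784)     # one incoming target batch
buffer.add(stream_batch)
replay = buffer.sample(32)
loss = contrastive_alignment(encoder(stream_batch), encoder(replay))
loss.backward()

Note that the usage consumes only target-domain batches and the buffer, with no source data revisited, consistent with the source-free setting described in the abstract.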
Year
2022
DOI
10.1109/CVPRW56347.2022.00418
Venue
IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)
DocType
Conference
Volume
2022
Issue
1
Citations
0
PageRank
0.34
References
0
Authors
3
Name                     Order  Citations  PageRank
Abu Md Niamul Taufique   1      0          0.68
Chowdhury Sadman Jahan   2      0          0.68
Andreas Savakis          3      377        41.10