Title
Incremental Learning with Unlabeled Data in the Wild.
Abstract
Deep neural networks are known to suffer from catastrophic forgetting in class-incremental learning, where performance on previous tasks degrades drastically when a new task is learned. To alleviate this effect, we propose to leverage a large, continuous stream of unlabeled data in the wild. In particular, to leverage such transient external data effectively, we design a novel class-incremental learning scheme with (a) a new distillation loss, termed global distillation, (b) a learning strategy to avoid overfitting to the most recent task, and (c) a sampling strategy for the desired external data. Our experimental results on various datasets, including CIFAR and ImageNet, demonstrate the superiority of the proposed methods over prior methods, particularly when a stream of unlabeled data is accessible: we achieve up to a 9.3% relative performance improvement over the state-of-the-art method.
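The abstract only names the ingredients of the scheme; as a rough illustration of the distillation idea, the sketch below shows a generic soft-target distillation term computed on unlabeled external data, with a frozen copy of the previous model acting as the teacher. All identifiers (distillation_loss, previous_model, temperature, lambda_d) are illustrative assumptions, not the paper's actual global-distillation formulation.

```python
# Minimal sketch, assuming a PyTorch setup; this is generic knowledge
# distillation on unlabeled data, not the paper's exact "global distillation".
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Soft-target KL distillation; the temperature value is a free choice."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=1)
    # KL divergence between teacher and student soft predictions, scaled by T^2
    # so gradient magnitudes stay comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2

# Hypothetical usage on a batch of unlabeled "wild" data:
# with torch.no_grad():
#     teacher_logits = previous_model(unlabeled_batch)  # frozen old model
# loss = ce_loss_on_labeled_batch \
#        + lambda_d * distillation_loss(model(unlabeled_batch), teacher_logits)
```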
Year
2019
Venue
CVPR Workshops
Field
Forgetting, Computer science, Incremental learning, Distillation, Artificial intelligence, Sampling (statistics), Overfitting, Deep neural networks, Machine learning, Performance improvement
DocType
Volume
abs/1903.12648
Citations
1
Journal
PageRank
0.35
References
0
Authors
4
Name          Order  Citations  PageRank
Kibok Lee     1      68         5.16
Kimin Lee     2      51         11.57
Jinwoo Shin   3      513        56.35
Honglak Lee   4      6247       398.39