Title
Overcoming Catastrophic Forgetting With Unlabeled Data in the Wild
Abstract
Lifelong learning with deep neural networks is well-known to suffer from catastrophic forgetting: the performance on previous tasks drastically degrades when learning a new task. To alleviate this effect, we propose to leverage a large stream of unlabeled data easily obtainable in the wild. In particular, we design a novel class-incremental learning scheme with (a) a new distillation loss, termed global distillation, (b) a learning strategy to avoid overfitting to the most recent task, and (c) a confidence-based sampling method to effectively leverage unlabeled external data. Our experimental results on various datasets, including CIFAR and ImageNet, demonstrate the superiority of the proposed methods over prior methods, particularly when a stream of unlabeled data is accessible: our method shows up to 15.8% higher accuracy and 46.5% less forgetting compared to the state-of-the-art method. The code is available at https://github.com/kibok90/iccv2019-inc.
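To make the abstract's three components more concrete, below is a minimal, hypothetical sketch in PyTorch of the general ideas it names: a temperature-scaled distillation loss computed against a previous model's outputs over all classes, and a confidence-based filter that keeps only the unlabeled examples the previous model predicts confidently. This is not the authors' released implementation (that is in the linked repository); the names prev_model, new_model, temperature, and keep_ratio are illustrative assumptions.

```python
# Hypothetical sketch, not the paper's exact formulation.
import torch
import torch.nn.functional as F

def distillation_loss(new_logits, old_logits, temperature=2.0):
    """KL divergence between softened outputs of the new and previous models."""
    log_p_new = F.log_softmax(new_logits / temperature, dim=1)
    p_old = F.softmax(old_logits / temperature, dim=1)
    # Scale by T^2 so gradient magnitude is comparable across temperatures.
    return F.kl_div(log_p_new, p_old, reduction="batchmean") * temperature ** 2

def confidence_based_sample(prev_model, unlabeled_x, keep_ratio=0.5):
    """Keep the unlabeled examples the previous model predicts most confidently."""
    with torch.no_grad():
        probs = F.softmax(prev_model(unlabeled_x), dim=1)
        confidence, _ = probs.max(dim=1)
    k = max(1, int(keep_ratio * unlabeled_x.size(0)))
    idx = confidence.topk(k).indices
    return unlabeled_x[idx]

# Tiny usage example with random tensors standing in for real models and data.
if __name__ == "__main__":
    prev_model = torch.nn.Linear(32, 10)  # stands in for the model trained on previous tasks
    new_model = torch.nn.Linear(32, 10)   # stands in for the model learning the new task
    wild = torch.randn(64, 32)            # unlabeled data "in the wild"
    selected = confidence_based_sample(prev_model, wild, keep_ratio=0.25)
    loss = distillation_loss(new_model(selected), prev_model(selected).detach())
    print(loss.item())
```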
Year
2019
DOI
10.1109/ICCV.2019.00040
Venue
2019 IEEE/CVF International Conference on Computer Vision (ICCV)
Keywords
distillation loss, global distillation, learning strategy, catastrophic forgetting, lifelong learning, deep neural networks, class-incremental learning scheme, unlabeled external data, confidence-based sampling method, CIFAR dataset, ImageNet dataset
Field
Forgetting, Pattern recognition, Computer science, Speech recognition, Artificial intelligence
DocType
Conference
Volume
2019
Issue
1
ISSN
1550-5499
ISBN
978-1-7281-4804-5
Citations
7
PageRank
0.49
References
7
Authors
4
Name          Order  Citations  PageRank
Kibok Lee     1      68         5.16
Kimin Lee     2      51         11.57
Jinwoo Shin   3      513        56.35
Honglak Lee   4      6247       398.39