Title
Variational Continual Learning
Abstract
This paper develops variational continual learning (VCL), a simple but general framework for continual learning that fuses online variational inference (VI) and recent advances in Monte Carlo VI for neural networks. The framework can successfully train both deep discriminative models and deep generative models in complex continual learning settings where existing tasks evolve over time and entirely new tasks emerge. Experimental results show that VCL outperforms state-of-the-art continual learning methods on a variety of tasks, avoiding catastrophic forgetting in a fully automatic way.
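The fusion of online VI with Monte Carlo VI that the abstract describes rests on the paper's recursive posterior update: after seeing the data for task t, the previous approximate posterior acts as the prior, and the product with the new likelihood is projected back onto the variational family. A brief sketch in LaTeX notation (q_t, \theta, \mathcal{D}_t, and Z_t follow the paper's symbols; the projection is the KL minimization used in VCL):

q_0(\theta) = p(\theta), \qquad
q_t(\theta) = \arg\min_{q \in \mathcal{Q}} \mathrm{KL}\!\left( q(\theta) \,\Big\|\, \tfrac{1}{Z_t}\, q_{t-1}(\theta)\, p(\mathcal{D}_t \mid \theta) \right), \quad t = 1, 2, \ldots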
Year
2018
Venue
ICLR
DocType
Conference
Volume
abs/1710.10628
Citations
0
PageRank
0.34
References
0
Authors
4
Name                Order   Citations   PageRank
Cuong Nguyen        1       207         35.89
Yingzhen Li         2       82          11.76
Bui, Thang D.       3       57          5.77
Richard E. Turner   4       322         37.95