Abstract |
---|
This work addresses continual learning for non-stationary data, using Bayesian neural networks and memory-based online variational Bayes. We represent the posterior approximation of the network weights by a diagonal Gaussian distribution and a complementary memory of raw data. This raw data corresponds to likelihood terms that cannot be well approximated by the Gaussian. We introduce a novel method for sequentially updating both components of the posterior approximation. Furthermore, we propose Bayesian forgetting and a Gaussian diffusion process for adapting to non-stationary data. The experimental results show that our update method improves on existing approaches for streaming data. Additionally, the adaptation methods lead to better predictive performance for non-stationary data. |
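The abstract names two adaptation mechanisms for non-stationary data: Bayesian forgetting (tempering the posterior back toward the prior) and a Gaussian diffusion process (injecting noise into the weights). Below is a minimal sketch of how each could act on a diagonal Gaussian posterior. The function names, the zero-mean prior, and the exact placement of the tempering exponent `beta` are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def bayesian_forgetting(mu, var, prior_var, beta):
    """Temper q towards the prior: q_new ~ q^(1-beta) * p^beta.

    For Gaussians this mixes natural parameters: the precision and the
    precision-weighted mean are each combined with weight beta.
    Assumes a zero-mean prior p(w) = N(0, prior_var).
    """
    prec = (1.0 - beta) / var + beta / prior_var  # new precision
    prec_mean = (1.0 - beta) * mu / var           # prior contributes 0 (zero mean)
    return prec_mean / prec, 1.0 / prec           # new (mu, var)

def gaussian_diffusion(mu, var, diffusion_var):
    """Diffuse the weights: w_t = w_{t-1} + eps, eps ~ N(0, diffusion_var).

    The mean is unchanged; the variance grows additively, broadening the
    posterior so that new data can overwrite stale beliefs.
    """
    return mu, var + diffusion_var

# Toy usage: a single weight whose posterior has become overconfident.
mu, var = 1.5, 0.01
mu_f, var_f = bayesian_forgetting(mu, var, prior_var=1.0, beta=0.1)
mu_d, var_d = gaussian_diffusion(mu, var, diffusion_var=0.05)
print(f"forgetting: mu={mu_f:.3f}, var={var_f:.4f}")
print(f"diffusion:  mu={mu_d:.3f}, var={var_d:.4f}")
```

Note the qualitative difference under these assumptions: forgetting pulls the mean toward the prior while re-inflating the variance, whereas diffusion leaves the mean intact and only widens the posterior.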
Year | Venue | Keywords |
---|---|---
2020 | ICLR | Continual Learning, Online Variational Bayes, Non-Stationary Data, Bayesian Neural Networks, Variational Inference, Lifelong Learning, Concept Drift, Episodic Memory |
DocType | Citations | PageRank
---|---|---
Conference | 0 | 0.34
References | Authors
---|---
31 | 5
Name | Order | Citations | PageRank |
---|---|---|---
Richard Kurle | 1 | 0 | 2.70 |
Botond Cseke | 2 | 193 | 11.55 |
Alexej Klushyn | 3 | 0 | 1.69 |
Patrick van der Smagt | 4 | 188 | 24.23 |
Stephan Günnemann | 5 | 833 | 69.26