Title: Memorized Variational Continual Learning for Dirichlet Process Mixtures
Abstract:
Bayesian nonparametric models are theoretically well suited to streaming data because they can adapt model complexity to the observed data. However, very limited work has addressed posterior inference in a streaming fashion, and most existing variational inference algorithms require a truncation of the variational distributions that cannot vary with the data. In this paper, we focus on Dirichlet process mixture models and develop a corresponding variational continual learning approach, called memorized variational continual learning (MVCL), which maintains memorized sufficient statistics for previous tasks and can handle both posterior updates and incoming data in a continual learning setting. Furthermore, we extend MVCL to two cases of mixture models that handle different data types. Experiments demonstrate the comparable inference capability of MVCL on both discrete and real-valued datasets, while automatically inferring the number of mixture components.
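The mechanism the abstract describes, caching per-task sufficient statistics so that a task's contribution can be swapped out and recomputed when the task is revisited, can be illustrated with a toy example. The sketch below is not the paper's MVCL algorithm: it assumes a DP mixture of unit-variance Gaussians, and the class name MemoizedDPMixture, the smoothed mixing weights, the vague "new component" proposal (standard deviation s0), and the birth heuristic are all hypothetical simplifications; the paper's conjugate updates and its extensions to other data types are omitted.

```python
import numpy as np


class MemoizedDPMixture:
    """Toy memoized variational updates for a DP mixture of
    unit-variance Gaussians. Illustrative only: all names and
    constants are assumptions, not the paper's MVCL algorithm."""

    def __init__(self, alpha=1.0, dim=2, s0=10.0):
        self.alpha = alpha            # DP concentration parameter
        self.dim = dim
        self.s0 = s0                  # std of the vague "new component" proposal
        self.K = 0                    # current number of components
        self.N = np.zeros(0)          # global per-component soft counts
        self.Sx = np.zeros((0, dim))  # global per-component sums of x
        self.memory = {}              # task_id -> (N_t, Sx_t): memorized stats

    def _log_joint(self, X):
        """Columns 0..K-1: existing components; column K: a vague
        Gaussian at the origin standing in for a brand-new component."""
        w = np.append(self.N, self.alpha) + 0.1        # smoothed mixing mass
        logw = np.log(w / w.sum())
        means = self.Sx / np.maximum(self.N, 1e-8)[:, None]
        sq = ((X[:, None, :] - means[None]) ** 2).sum(-1)
        lp_old = logw[:-1][None] - 0.5 * sq - 0.5 * self.dim * np.log(2 * np.pi)
        lp_new = (logw[-1] - 0.5 * (X ** 2).sum(-1) / self.s0 ** 2
                  - 0.5 * self.dim * np.log(2 * np.pi * self.s0 ** 2))
        return np.column_stack([lp_old, lp_new])

    def update_task(self, task_id, X, n_iters=3, min_new=5):
        for it in range(n_iters):
            # Local step: responsibilities under current global statistics.
            logp = self._log_joint(X)
            logp -= logp.max(1, keepdims=True)
            R = np.exp(logp)
            R /= R.sum(1, keepdims=True)
            # Heuristic birth move, once per visit: keep the "new" column
            # as a real component if enough points prefer it outright.
            if it == 0 and (R.argmax(1) == self.K).sum() >= min_new:
                self.K += 1
                self.N = np.append(self.N, 0.0)
                self.Sx = np.vstack([self.Sx, np.zeros((1, self.dim))])
            R = R[:, :self.K]
            R /= R.sum(1, keepdims=True) + 1e-12
            # Memoized global step: swap this task's cached sufficient
            # statistics for freshly computed ones, so revisiting a task
            # replaces its contribution instead of double-counting it.
            N_t, Sx_t = R.sum(0), R.T @ X
            if task_id in self.memory:
                oN, oSx = self.memory[task_id]
                self.N[:len(oN)] -= oN
                self.Sx[:len(oN)] -= oSx
            self.N += N_t
            self.Sx += Sx_t
            self.memory[task_id] = (N_t, Sx_t)


rng = np.random.default_rng(0)
model = MemoizedDPMixture()
for t in range(3):                                  # three "tasks" with drifting means
    model.update_task(t, rng.normal(3.0 * t, 1.0, size=(100, 2)))
model.update_task(0, rng.normal(0.0, 1.0, size=(100, 2)))   # revisit task 0
print(model.K, model.N.sum())   # roughly one component per task; total count stays 300
```

Note the revisit of task 0 at the end: the cached statistics are replaced rather than appended, so the total soft count remains 300, which is the replace-not-duplicate behavior that lets the posterior be updated continually without a fixed truncation.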
Year: 2019
DOI: 10.1109/ACCESS.2019.2947722
Venue: IEEE ACCESS
Keywords: Data models, Task analysis, Bayes methods, Mixture models, Computational modeling, Inference algorithms, Approximation algorithms, Bayesian nonparametric, streaming data, variational continual learning, Dirichlet process mixture, memorized sufficient statistics, discrete and real-valued datasets
DocType: Journal
Volume: 7
ISSN: 2169-3536
Citations: 0
PageRank: 0.34
References: 0
Authors: 3
Name         Order  Citations  PageRank
Yang Yang    1      20         1.36
Bo Chen      2      304        34.22
Hongwei Liu  3      416        66.06