Title
Partitioned Variational Inference: A unified framework encompassing federated and continual learning.
Abstract
Variational inference (VI) has become the method of choice for fitting many modern probabilistic models. However, practitioners are faced with a fragmented literature that offers a bewildering array of algorithmic options. First, the choice of variational family. Second, the granularity of the updates, e.g., whether the updates are global, or local to each data point and employ message passing. Third, the method of optimization (bespoke or black-box, closed-form or stochastic updates, etc.). This paper presents a new framework, termed Partitioned Variational Inference (PVI), that explicitly acknowledges these algorithmic dimensions of VI, unifies disparate literature, and provides guidance on usage. Crucially, the proposed PVI framework allows us to identify new ways of performing VI that are ideally suited to challenging learning scenarios, including federated learning (where distributed computing is leveraged to process non-centralized data) and continual learning (where new data and tasks arrive over time and must be accommodated quickly). We showcase these new capabilities by developing communication-efficient federated training of Bayesian neural networks and continual learning for Gaussian process models with private pseudo-points. The new methods significantly outperform the state of the art, whilst being almost as straightforward to implement as standard VI.
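To make the framework described above concrete: PVI maintains a global approximate posterior assembled from one approximate-likelihood factor per data partition, and each step refines a single factor by solving a local variational free-energy problem. The LaTeX sketch below is illustrative rather than quoted from the paper; the notation (prior p(\theta), data shards \mathbf{y}_1, \ldots, \mathbf{y}_M, approximate-likelihood factors t_m, variational family \mathcal{Q}) is assumed.

    q(\theta) \propto p(\theta) \prod_{m=1}^{M} t_m(\theta)

    q^{\mathrm{new}}(\theta) = \operatorname*{arg\,max}_{q \in \mathcal{Q}} \int q(\theta) \log \frac{p(\mathbf{y}_m \mid \theta)\, q^{\mathrm{old}}(\theta)}{t_m^{\mathrm{old}}(\theta)\, q(\theta)} \, \mathrm{d}\theta

    t_m^{\mathrm{new}}(\theta) = t_m^{\mathrm{old}}(\theta)\, \frac{q^{\mathrm{new}}(\theta)}{q^{\mathrm{old}}(\theta)}

Under this scheme, a single partition (M = 1) recovers standard global VI, one factor per data point gives local, message-passing-style updates, and one factor per client or per task corresponds to the federated and continual-learning settings highlighted in the abstract.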
Year
2018
Venue
arXiv: Machine Learning
DocType
Journal
Volume
abs/1811.11206
Citations
1
PageRank
0.35
References
0
Authors
4
Name                 Order   Citations / PageRank
Bui, Thang D.        1       575.77
Cuong Nguyen         2       20735.89
Siddharth Swaroop    3       33.06
Richard E. Turner    4       32237.95