Title
Tensor Monte Carlo: Particle Methods for the GPU Era
Abstract
Multi-sample, importance-weighted variational autoencoders (IWAE) give tighter bounds and more accurate uncertainty estimates than variational autoencoders (VAEs) trained with a standard single-sample objective. However, IWAEs scale poorly: as the latent dimensionality grows, they require exponentially many samples to retain the benefits of importance weighting. While sequential Monte Carlo (SMC) can address this problem, it is prohibitively slow because the resampling step imposes a sequential structure that cannot be parallelised; moreover, resampling is non-differentiable, which is problematic when learning approximate posteriors. To address these issues, we developed tensor Monte Carlo (TMC), which gives exponentially many importance samples by separately drawing K samples for each of the n latent variables and then averaging over all K^n possible combinations. While the sum over exponentially many terms might seem intractable, in many cases it can be computed efficiently as a series of tensor inner-products. We show that TMC is superior to IWAE on a generative model with multiple stochastic layers trained on the MNIST handwritten-digit database, and that TMC can be combined with standard variance-reduction techniques.
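To make the abstract's central claim concrete (that the average over all K^n sample combinations can be computed as a series of tensor inner-products), the following NumPy sketch works through a toy two-layer Gaussian chain z1 -> z2 -> x with a factorised proposal. The model, the standard-normal proposals, and all variable names are illustrative assumptions rather than the paper's MNIST architecture; for a chain, the K^2 combinations reduce to K x K log-space contractions, and a K-particle IWAE estimate on the same samples is included for comparison.

```python
# Minimal sketch of the TMC estimator on a toy chain z1 -> z2 -> x,
# assuming: z1 ~ N(0,1), z2|z1 ~ N(z1,1), x|z2 ~ N(z2,1), and a
# factorised standard-normal proposal q(z1) q(z2). None of this is
# the paper's experimental setup; it only illustrates the tensor
# inner-product structure of the average over all K^n combinations.
import numpy as np
from scipy.special import logsumexp
from scipy.stats import norm

rng = np.random.default_rng(0)
K = 64          # samples drawn per latent variable
x = 1.5         # a single observed datapoint

# K proposal samples for each latent variable.
z1 = rng.normal(size=K)
z2 = rng.normal(size=K)

# Log importance-weight factors. For a chain, the sum over all K**2
# sample combinations factorises into K and K x K tensors:
#   phat = (1/K**2) * sum_{k1,k2} f1[k1] * F2[k1,k2] * g[k2]
log_f1 = np.zeros(K)                                  # log p(z1)/q(z1) = 0: q(z1) is the prior here
log_F2 = (norm.logpdf(z2[None, :], z1[:, None], 1.0)  # log p(z2|z1), shape (K, K)
          - norm.logpdf(z2[None, :], 0.0, 1.0))       # minus log q(z2)
log_g = norm.logpdf(x, z2, 1.0)                       # log p(x|z2), shape (K,)

# TMC bound: contract the chain in log-space for numerical stability.
inner = logsumexp(log_F2 + log_g[None, :], axis=1) - np.log(K)  # sum out k2
log_phat_tmc = logsumexp(log_f1 + inner) - np.log(K)            # sum out k1

# K-particle IWAE on the same samples: only the K "diagonal" combinations.
log_w_iwae = log_f1 + np.diag(log_F2) + log_g
log_phat_iwae = logsumexp(log_w_iwae) - np.log(K)

print(f"TMC bound over K^2 = {K**2} combinations: {log_phat_tmc:.4f}")
print(f"IWAE bound over K = {K} particles:        {log_phat_iwae:.4f}")
```

Computing the contractions with logsumexp keeps the exponentially many weights in log-space, which is what lets the average over K^n combinations run as a handful of dense tensor operations rather than an explicit enumeration.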
Year
2018
Venue
Advances in Neural Information Processing Systems 32 (NIPS 2019)
Keywords
generative model
Field
Mathematical optimization, Monte Carlo method, Weighting, Importance sampling, Tensor, Algorithm, Latent variable, Curse of dimensionality, Resampling, Message passing, Mathematics
DocType
Journal
Volume
32
ISSN
1049-5258
Citations
0
PageRank
0.34
References
12
Authors
1
Name
Aitchison, Laurence
Order
1
Citations
20
PageRank
7.00