Title
VAE Learning via Stein Variational Gradient Descent
Abstract
A new method for learning variational autoencoders (VAEs) is developed, based on Stein variational gradient descent (SVGD). A key advantage of this approach is that one need not make parametric assumptions about the form of the encoder distribution. Performance is further enhanced by integrating the proposed encoder with importance sampling. Excellent performance is demonstrated across multiple unsupervised and semi-supervised problems, including semi-supervised analysis of ImageNet, demonstrating the scalability of the model to large datasets.
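For context, the core of SVGD is a deterministic particle update that combines a kernel-weighted score term (driving particles toward high-density regions) with a kernel-gradient repulsion term (keeping particles spread out). The sketch below is a generic, minimal 1-D SVGD illustration, not the paper's encoder-learning algorithm; the RBF kernel, fixed bandwidth `h`, step size, and standard-normal target are all illustrative assumptions.

```python
import numpy as np

def svgd_step(x, score, h=1.0):
    # Pairwise RBF kernel K[j, i] = k(x_j, x_i) and its gradient w.r.t. x_j.
    diff = x[:, None] - x[None, :]           # diff[j, i] = x_j - x_i
    K = np.exp(-diff**2 / (2 * h**2))
    grad_K = K * (-diff) / h**2              # d/dx_j k(x_j, x_i)
    # SVGD direction: phi(x_i) = mean_j [ k(x_j, x_i) * score(x_j) + grad_xj k(x_j, x_i) ]
    return (K * score(x)[:, None] + grad_K).mean(axis=0)

rng = np.random.default_rng(0)
x = rng.normal(loc=10.0, scale=1.0, size=50)  # particles start far from the target
score = lambda z: -z                          # score of N(0, 1): d/dz log p(z) = -z
for _ in range(1000):
    x = x + 0.1 * svgd_step(x, score)
# After the updates, the particles approximate samples from N(0, 1).
```

In the paper's setting, the particles are samples from the (nonparametric) encoder distribution, and the score comes from the model's joint density rather than a fixed target.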
Year
2017
Venue
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 30 (NIPS 2017)
Field
Importance sampling, Gradient descent, Mathematical optimization, Computer science, Parametric statistics, Encoder, Artificial intelligence, Machine learning, Scalability
DocType
Conference
Volume
30
ISSN
1049-5258
Citations
12
PageRank
0.64
References
27
Authors (6)
Name            Order  Citations  PageRank
Pu, Yunchen     1      57         1.97
Zhe Gan         2      319        32.58
Ricardo Henao   3      286        23.85
Chunyuan Li     4      467        33.86
Han, Shaobo     5      16         1.06
L. Carin        6      4603       339.36