Title: Flexible Prior Distributions for Deep Generative Models
Abstract: We consider the problem of training generative models with deep neural networks as generators, i.e., networks that map latent codes to data points. Whereas the dominant paradigm combines simple priors over codes with complex deterministic models, we argue that it might be advantageous to use more flexible code distributions. We demonstrate how these distributions can be induced directly from the data. The benefits include: more powerful generative models, better modeling of latent structure, and explicit control of the degree of generalization.
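The abstract contrasts simple fixed priors over latent codes with flexible code distributions induced from the data. As a purely illustrative sketch (an assumption, not the paper's actual construction), one way to induce such a prior is to fit a mixture model to latent codes associated with training data and then sample generator inputs from that mixture instead of from a fixed N(0, I):

    # Illustrative sketch only; NOT the method of Kilcher et al. (2017).
    # Idea: instead of a fixed N(0, I) prior over latent codes, induce a
    # flexible prior by fitting a mixture model to codes derived from the
    # data, then sample fresh codes from that mixture.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)

    # Stand-in for encoder outputs: in practice these would be latent
    # codes of real training examples under some inference network
    # (hypothetical here; dimensions and cluster structure are made up).
    codes = np.concatenate([
        rng.normal(loc=-2.0, scale=0.5, size=(500, 16)),
        rng.normal(loc=+2.0, scale=0.5, size=(500, 16)),
    ])

    # Induce the prior directly from the observed codes.
    prior = GaussianMixture(n_components=2, covariance_type="full").fit(codes)

    # Draw fresh codes from the flexible prior; a generator network
    # would then map these codes to data points.
    z, _ = prior.sample(100)
    print(z.shape)  # (100, 16)

Under this reading, sampling from a data-induced mixture rather than a fixed Gaussian gives explicit control over which regions of code space the generator is asked to cover, which is one way to interpret the "degree of generalization" claim above.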
Year: 2017
Venue: arXiv: Learning
Field: Data point, Generative topographic map, Computer science, Artificial intelligence, Generative grammar, Prior probability, Machine learning, Deep neural networks, Generative model
DocType: Journal
Volume: abs/1710.11383
Citations: 0
PageRank: 0.34
References: 10
Authors: 3
Name             Order  Citations  PageRank
Yannic Kilcher   1      8          4.28
Aurelien Lucchi  2      2419       89.45
Thomas Hofmann   3      3          1.72