Title: Latent Space Oddity: on the Curvature of Deep Generative Models
Abstract: Deep generative models provide a systematic way to learn nonlinear data distributions through a set of latent variables and a nonlinear function that maps latent points into the input space. The nonlinearity of the generator implies that the latent space gives a distorted view of the input space. Under mild conditions, we show that this distortion can be characterized by a stochastic Riemannian metric, and we demonstrate that distances and interpolants are significantly improved under this metric. This in turn improves probability distributions, sampling algorithms and clustering in the latent space. Our geometric analysis further reveals that current generators provide poor variance estimates and we propose a new generator architecture with vastly improved variance estimates. Results are demonstrated on convolutional and fully connected variational autoencoders, but the formalism easily generalizes to other deep generative models.
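The abstract's central object, the Riemannian metric that the generator induces on the latent space, can be illustrated with a small sketch: for a smooth (deterministic) generator g, the pull-back metric is M(z) = J_g(z)^T J_g(z), where J_g is the Jacobian of g. The toy generator below is a hypothetical stand-in for a decoder mean function, not the paper's architecture, and the Jacobian is approximated numerically for self-containment.

```python
import numpy as np

def generator(z):
    # Toy nonlinear generator g: R^2 -> R^3 (hypothetical stand-in for a
    # trained decoder's mean function).
    return np.array([z[0], z[1], z[0] ** 2 + z[1] ** 2])

def jacobian(f, z, eps=1e-6):
    # Numerical Jacobian of f at z via central differences.
    z = np.asarray(z, dtype=float)
    out_dim = len(f(z))
    J = np.zeros((out_dim, len(z)))
    for i in range(len(z)):
        dz = np.zeros_like(z)
        dz[i] = eps
        J[:, i] = (f(z + dz) - f(z - dz)) / (2 * eps)
    return J

def pullback_metric(f, z):
    # Metric induced in the latent space: M(z) = J(z)^T J(z).
    # Latent curve length is then the integral of sqrt(dz^T M(z) dz),
    # not the plain Euclidean length, which is why straight-line
    # interpolants in the latent space can be misleading.
    J = jacobian(f, z)
    return J.T @ J

M = pullback_metric(generator, [1.0, 0.0])
```

In the paper the generator is stochastic, so the metric itself becomes a random object; this deterministic sketch only shows the deterministic special case.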
Year: 2018
Venue: International Conference on Learning Representations
Field: Curvature, Nonlinear system, Geometric analysis, Latent variable, Probability distribution, Artificial intelligence, Cluster analysis, Distortion, Mathematics, Machine learning, Gibbs sampling
DocType: Conference
Citations: 8
PageRank: 0.53
References: 8
Authors: 3
Name: Georgios Arvanitidis, Order: 1, Citations: 10, PageRank: 1.91
Name: Lars Kai Hansen, Order: 2, Citations: 2776, PageRank: 341.03
Name: Soren Hauberg, Order: 3, Citations: 230, PageRank: 24.36