Title
SVGD as a kernelized Wasserstein gradient flow of the chi-squared divergence
Abstract
Stein Variational Gradient Descent (SVGD), a popular sampling algorithm, is often described as the kernelized gradient flow for the Kullback-Leibler divergence in the geometry of optimal transport. We introduce a new perspective on SVGD that instead views SVGD as the (kernelized) gradient flow of the chi-squared divergence, which, we show, exhibits a strong form of uniform exponential ergodicity under conditions as weak as a Poincaré inequality. This perspective leads us to propose an alternative to SVGD, called Laplacian Adjusted Wasserstein Gradient Descent (LAWGD), that can be implemented from the spectral decomposition of the Laplacian operator associated with the target density. We show that LAWGD exhibits strong convergence guarantees and good practical performance.
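For context, the abstract contrasts LAWGD with standard SVGD (Liu and Wang, 2016), whose update moves each particle along a kernel-weighted score plus a repulsion term. The sketch below is a minimal NumPy illustration of that standard SVGD step, not of this paper's LAWGD; the RBF kernel, bandwidth h, and step size are assumptions chosen for the example.

    import numpy as np

    def svgd_step(X, grad_log_pi, step=0.1, h=1.0):
        """One SVGD update for particles X of shape (n, d):

        phi[i] = (1/n) * sum_j [ k(x_j, x_i) grad_log_pi(x_j) + grad_{x_j} k(x_j, x_i) ]

        with the RBF kernel k(x, y) = exp(-||x - y||^2 / h); h is an assumed bandwidth.
        """
        diff = X[:, None, :] - X[None, :, :]          # diff[j, i] = x_j - x_i
        K = np.exp(-np.sum(diff ** 2, axis=-1) / h)   # K[j, i] = k(x_j, x_i)
        grad_K = (-2.0 / h) * diff * K[..., None]     # grad_K[j, i] = grad_{x_j} k(x_j, x_i)
        phi = (K.T @ grad_log_pi(X) + grad_K.sum(axis=0)) / X.shape[0]
        return X + step * phi

    # Toy usage: transport 200 particles toward a standard Gaussian target,
    # for which grad log pi(x) = -x.
    rng = np.random.default_rng(0)
    X = rng.normal(loc=5.0, scale=1.0, size=(200, 2))
    for _ in range(500):
        X = svgd_step(X, lambda x: -x)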
Year
2020
Venue
NeurIPS
DocType
Conference
Volume
33
Citations
0
PageRank
0.34
References
0
Authors
5
Name               Order  Citations  PageRank
Sinho Chewi        1      0          3.04
Thibaut Le Gouic   2      0          3.04
Lu Chen            3      0          0.68
Tyler Maunu        4      12         2.91
Philippe Rigollet  5      220        19.44