Title
Kernel Stein Discrepancy Descent
Abstract
Among dissimilarities between probability distributions, the Kernel Stein Discrepancy (KSD) has received much interest recently. We investigate the properties of its Wasserstein gradient flow to approximate a target probability distribution pi on R^d, known up to a normalization constant. This leads to a straightforwardly implementable, deterministic score-based method to sample from pi, named KSD Descent, which uses a set of particles to approximate pi. Remarkably, owing to a tractable loss function, KSD Descent can leverage robust parameter-free optimization schemes such as L-BFGS; this contrasts with other popular particle-based schemes such as the Stein Variational Gradient Descent algorithm. We study the convergence properties of KSD Descent and demonstrate its practical relevance. However, we also highlight failure cases by showing that the algorithm can get stuck in spurious local minima.
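The abstract describes KSD Descent as minimizing the kernel Stein discrepancy of a particle approximation of pi with an off-the-shelf optimizer such as L-BFGS. Below is a minimal sketch of that idea, assuming a Gaussian RBF kernel, a standard Gaussian target (so the score is available in closed form), and SciPy's L-BFGS-B; the helper names (stein_kernel, ksd2) and all numerical settings are illustrative choices, not the authors' implementation.

```python
import jax
import jax.numpy as jnp
import numpy as np
from scipy.optimize import minimize

def score(x):
    # Score of a standard Gaussian target: grad log pi(x) = -x.
    # Only the score is needed, so pi may be unnormalized in general.
    return -x

def stein_kernel(x, y, sigma=1.0):
    # Stein kernel k_pi built from a Gaussian RBF base kernel k:
    # k_pi(x, y) = s(x).s(y) k(x, y) + s(x).grad_y k + s(y).grad_x k
    #              + trace of the mixed second derivative of k.
    k = lambda a, b: jnp.exp(-jnp.sum((a - b) ** 2) / (2 * sigma ** 2))
    gkx = jax.grad(k, argnums=0)          # grad_x k(x, y)
    gky = jax.grad(k, argnums=1)          # grad_y k(x, y)
    sx, sy = score(x), score(y)
    trace_term = jnp.trace(jax.jacfwd(gky, argnums=0)(x, y))
    return k(x, y) * sx @ sy + sx @ gky(x, y) + gkx(x, y) @ sy + trace_term

def ksd2(particles):
    # Squared KSD of the empirical measure: (1/n^2) sum_{i,j} k_pi(x_i, x_j).
    pair = jax.vmap(lambda x: jax.vmap(lambda y: stein_kernel(x, y))(particles))(particles)
    return jnp.mean(pair)

# KSD Descent: treat the squared KSD as a loss over particle positions
# and minimize it directly with L-BFGS.
n, d = 50, 2
rng = np.random.default_rng(0)
x0 = rng.normal(size=(n, d)) + 3.0        # particles start away from the target

loss = lambda flat: ksd2(flat.reshape(n, d))
grad = jax.grad(loss)
res = minimize(lambda z: float(loss(jnp.asarray(z))),
               x0.ravel(),
               jac=lambda z: np.asarray(grad(jnp.asarray(z)), dtype=np.float64),
               method="L-BFGS-B")
particles = res.x.reshape(n, d)           # approximate samples from pi
```

Because the loss is an explicit, differentiable function of the particle positions, a quasi-Newton solver with its own line search can be applied directly; this is the parameter-free aspect the abstract contrasts with step-size-dependent schemes such as Stein Variational Gradient Descent.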
Year
2021
Venue
INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139
DocType
Conference
Volume
139
ISSN
2640-3498
Citations
0
PageRank
0.34
References
0
Authors
4
Name | Order | Citations | PageRank
Anna Korba | 1 | 3 | 3.42
Aubin-Frankowski Pierre-Cyril | 2 | 0 | 0.68
Szymon Majewski | 3 | 0 | 0.34
Pierre Ablin | 4 | 9 | 4.61