Abstract
---
This article considers the popular MCMC method of unadjusted Langevin Monte Carlo (LMC) and provides a non-asymptotic analysis of its sampling error in 2-Wasserstein distance. The proof is based on a refinement of the mean-square analysis of Li et al. (2019), and this refined framework automates the analysis of a large class of sampling algorithms based on discretizations of contractive SDEs. Using this framework, we establish an $\tilde{O}(\sqrt{d}/\epsilon)$ mixing time bound for LMC, without warm start, under the common log-smooth and log-strongly-convex conditions, plus a growth condition on the third-order derivative of the potential of the target measure. This bound improves the best previously known $\tilde{O}(d/\epsilon)$ result and is optimal (in order) in both the dimension $d$ and the accuracy tolerance $\epsilon$ for target measures satisfying the aforementioned assumptions. Our theoretical analysis is further validated by numerical experiments.
Year | Venue | Keywords
---|---|---
2022 | International Conference on Learning Representations (ICLR) | unadjusted Langevin algorithm / Langevin Monte Carlo; non-asymptotic sampling error in Wasserstein-2 distance; optimal dimension dependence; mean-square analysis
DocType | Citations | PageRank
---|---|---
Conference | 0 | 0.34
References | Authors
---|---
0 | 3
Name | Order | Citations | PageRank |
---|---|---|---|
Ruilin Li | 1 | 0 | 0.68 |
Hongyuan Zha | 2 | 0 | 0.34 |
Molei Tao | 3 | 16 | 5.64 |