Abstract |
---|
Conventional wisdom in the sampling literature, backed by a popular diffusion scaling limit, suggests that the mixing time of the Metropolis-Adjusted Langevin Algorithm (MALA) scales as $O(d^{1/3})$, where $d$ is the dimension. However, the diffusion scaling limit requires stringent assumptions on the target distribution and is asymptotic in nature. In contrast, the best known non-asymptotic mixing time bound for MALA on the class of log-smooth and strongly log-concave distributions is $O(d)$. In this work, we establish that the mixing time of MALA on this class of target distributions is $\widetilde\Theta(d^{1/2})$ under a warm start. Our upper-bound proof introduces a new technique based on a projection characterization of the Metropolis adjustment, which reduces the study of MALA to the well-studied discretization analysis of the Langevin SDE and bypasses direct computation of the acceptance probability. |
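The algorithm the abstract analyzes can be illustrated with a minimal sketch: MALA proposes a step of the Euler discretization of the Langevin SDE and then applies a Metropolis accept/reject correction. The target (a standard Gaussian), the step size, and all function names below are illustrative assumptions for this sketch, not taken from the paper.

```python
import numpy as np

def mala_step(x, log_pi, grad_log_pi, h, rng):
    """One MALA step: Langevin proposal + Metropolis adjustment.

    Proposal: y = x + h * grad log pi(x) + sqrt(2h) * N(0, I),
    i.e. one Euler step of the Langevin SDE with step size h.
    """
    d = x.shape[0]
    y = x + h * grad_log_pi(x) + np.sqrt(2.0 * h) * rng.standard_normal(d)

    def log_q(a, b):
        # Log density (up to a constant, which cancels in the ratio)
        # of proposing b from a: b ~ N(a + h * grad log pi(a), 2h I).
        diff = b - a - h * grad_log_pi(a)
        return -np.dot(diff, diff) / (4.0 * h)

    # Metropolis-Hastings log acceptance ratio.
    log_alpha = log_pi(y) + log_q(y, x) - log_pi(x) - log_q(x, y)
    if np.log(rng.uniform()) < log_alpha:
        return y, True
    return x, False

# Illustrative target: standard Gaussian in d = 10 (an assumption for this demo).
d = 10
log_pi = lambda x: -0.5 * np.dot(x, x)
grad_log_pi = lambda x: -x

rng = np.random.default_rng(0)
x = np.full(d, 5.0)        # a cold start, far from the mode
h = 0.5 / np.sqrt(d)       # step size scaling like d^{-1/2}, echoing the paper's rate
n_iters = 2000
accepts = 0
for _ in range(n_iters):
    x, acc = mala_step(x, log_pi, grad_log_pi, h, rng)
    accepts += acc
```

After a few thousand steps the chain concentrates near the mode, and the accept counter tracks how often the Metropolis correction lets the Langevin proposal through; without that correction (unadjusted Langevin), the chain would carry a discretization bias at any fixed step size.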
Year | Venue | DocType |
---|---|---|
2021 | COLT | Conference |
Citations | PageRank | References
---|---|---
0 | 0.34 | 0
Authors |
---|
6 |
Name | Order | Citations | PageRank |
---|---|---|---
Sinho Chewi | 1 | 0 | 3.04 |
Chen Lu | 2 | 3 | 2.10 |
Kwangjun Ahn | 3 | 0 | 0.68 |
Xiang Cheng | 4 | 22 | 4.81 |
Thibaut Le Gouic | 5 | 0 | 3.04 |
Philippe Rigollet | 6 | 220 | 19.44 |