Mirrored Langevin Dynamics

We consider the problem of sampling from constrained distributions, which has posed significant challenges to both non-asymptotic analysis and algorithmic design. We propose a unified framework, inspired by classical mirror descent, to derive novel first-order sampling schemes. We prove that, for a general target distribution with strongly convex potential, our framework implies the existence of a first-order algorithm achieving Õ(ε⁻²d) convergence, suggesting that the state-of-the-art Õ(ε⁻⁶d⁵) can be vastly improved. With the important Latent Dirichlet Allocation (LDA) application in mind, we specialize our algorithm to sample from Dirichlet posteriors, and derive the first non-asymptotic Õ(ε⁻²d²) rate for first-order sampling. We further extend our framework to the mini-batch setting and prove convergence rates when only stochastic gradients are available. Finally, we report promising experimental results for LDA on real datasets.
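To make the Dirichlet specialization concrete, the following is a minimal sketch (not the authors' implementation) of mirrored Langevin dynamics: run unadjusted Langevin steps in the dual space of the entropic mirror map on the simplex, then map iterates back through the inverse mirror map (a softmax-style transform). For a Dirichlet(α) target, the pushforward dual potential has the closed-form gradient A·x − α with A = Σᵢ αᵢ. All parameter choices (α, step size, chain counts) below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

alpha = np.array([2.0, 3.0, 5.0])  # illustrative Dirichlet parameters
A = alpha.sum()
d = len(alpha) - 1                 # dual-space dimension

def to_simplex(y):
    """Inverse mirror map: y in R^d -> x in the (d+1)-simplex.

    Equivalent to a numerically stabilized softmax of [y, 0].
    """
    z = np.concatenate([y, np.zeros(y.shape[:-1] + (1,))], axis=-1)
    z -= z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

n_chains, n_steps, eta = 1000, 5000, 0.01
y = rng.standard_normal((n_chains, d))  # dual-space iterates, one row per chain

samples = []
for t in range(n_steps):
    x = to_simplex(y)
    # Closed-form dual gradient for a Dirichlet(alpha) target: A*x - alpha
    grad = A * x[:, :d] - alpha[:d]
    # Unadjusted Langevin step in the (unconstrained) dual space
    y = y - eta * grad + np.sqrt(2 * eta) * rng.standard_normal(y.shape)
    if t >= n_steps - 1000:            # keep post-burn-in samples
        samples.append(to_simplex(y))

mean = np.concatenate(samples).mean(axis=0)
# Empirical mean should approach the Dirichlet mean alpha / A = [0.2, 0.3, 0.5]
```

Because the dual variable is unconstrained, no projection or rejection step is needed to stay on the simplex; the constraint is enforced automatically by the inverse mirror map.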


Published in:
Advances in Neural Information Processing Systems 31 (NIPS 2018)
Presented at:
32nd Conference on Neural Information Processing Systems (NIPS), Montreal, Canada, Dec 2-8, 2018
Year:
2018
Publisher:
Neural Information Processing Systems (NIPS), La Jolla
ISSN:
1049-5258
 Record created 2019-06-18, last modified 2020-04-20

