Fast Mean Estimation with Sub-Gaussian Rates
We present an improved analysis of the Euler-Maruyama discretization of the Langevin diffusion. Our analysis does not require global contractivity and yields polynomial dependence on the time horizon. Compared to existing approaches, we make an additional smoothness assumption and improve the existing rate from $O(\eta)$ to $O(\eta^2)$ in terms of the KL divergence. This result matches the correct order for numerical SDEs, without suffering from exponential time dependence. When applied to sampling and learning algorithms, this result simultaneously improves all methods based on Dalalyan's approach.
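The scheme analyzed in the abstract is the Euler-Maruyama discretization of the Langevin diffusion $dX_t = -\nabla U(X_t)\,dt + \sqrt{2}\,dB_t$, also known as the unadjusted Langevin algorithm. The following is a minimal illustrative sketch of that discretization, not the paper's implementation; the function names, the step size $\eta$, and the Gaussian target in the example are assumptions chosen for illustration.

import numpy as np

def euler_maruyama_langevin(grad_U, x0, eta=1e-2, n_steps=10_000, rng=None):
    # Euler-Maruyama discretization of the Langevin diffusion
    # dX_t = -grad U(X_t) dt + sqrt(2) dB_t, with step size eta:
    #   x_{k+1} = x_k - eta * grad_U(x_k) + sqrt(2 * eta) * xi_k,  xi_k ~ N(0, I).
    rng = np.random.default_rng() if rng is None else rng
    x = np.array(x0, dtype=float)
    samples = np.empty((n_steps, x.size))
    for k in range(n_steps):
        x = x - eta * grad_U(x) + np.sqrt(2.0 * eta) * rng.standard_normal(x.size)
        samples[k] = x
    return samples

# Illustrative example (assumed, not from the paper): sample from a standard 2-D
# Gaussian, whose potential is U(x) = ||x||^2 / 2, so grad_U(x) = x. Under the
# paper's assumptions, the discretization error of the chain's law is O(eta^2)
# in KL divergence rather than the previously known O(eta).
samples = euler_maruyama_langevin(grad_U=lambda x: x, x0=np.zeros(2))
print(samples[len(samples) // 2:].mean(axis=0))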
Type
conference paper
Authors
Publication date
2019
Published in
Proceedings of Machine Learning Research
Volume
99
Peer reviewed
Reviewed
EPFL units
Available on Infoscience
December 2, 2019