Infoscience
conference paper

Dimensionally Tight Bounds for Second-Order Hamiltonian Monte Carlo

Mangoubi, Oren • Vishnoi, Nisheeth K.
January 1, 2018
Advances in Neural Information Processing Systems 31 (NIPS 2018)
32nd Conference on Neural Information Processing Systems (NIPS)

Hamiltonian Monte Carlo (HMC) is a widely deployed method to sample from high-dimensional distributions in statistics and machine learning. HMC is known to run very efficiently in practice, and its popular second-order "leapfrog" implementation has long been conjectured to run in d^(1/4) gradient evaluations. Here we show that this conjecture is true when sampling from strongly log-concave target distributions that satisfy a weak third-order regularity property associated with the input data. Our regularity condition is weaker than the Lipschitz Hessian property and allows us to show faster convergence bounds for a much larger class of distributions than would be possible with the usual Lipschitz Hessian constant alone. Important distributions that satisfy our regularity condition include posterior distributions used in Bayesian logistic regression for which the data satisfies an "incoherence" property. Our result compares favorably with the best available bounds for the class of strongly log-concave distributions, which grow like d^(1/2) gradient evaluations with the dimension. Moreover, our simulations on synthetic data suggest that, when our regularity condition is satisfied, leapfrog HMC performs better than its competitors, both in terms of accuracy and in terms of the number of gradient evaluations it requires.
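For readers unfamiliar with the method the abstract analyzes, here is a minimal sketch of second-order (leapfrog) HMC applied to a standard Gaussian target, one of the simplest strongly log-concave distributions. This is an illustrative implementation, not the authors' code; the function names, step size, and trajectory length are all chosen for the example. The point to notice is that each leapfrog step requires one fresh gradient evaluation, which is the cost measure behind the d^(1/4) and d^(1/2) bounds discussed above.

```python
import numpy as np

def leapfrog(q, p, grad_log_p, step_size, n_steps):
    """One leapfrog trajectory; each step uses one new gradient evaluation."""
    q, p = q.copy(), p.copy()
    p += 0.5 * step_size * grad_log_p(q)        # initial half-step for momentum
    for _ in range(n_steps - 1):
        q += step_size * p                      # full position step
        p += step_size * grad_log_p(q)          # full momentum step
    q += step_size * p
    p += 0.5 * step_size * grad_log_p(q)        # final half-step for momentum
    return q, p

def hmc_sample(log_p, grad_log_p, q0, n_samples, step_size=0.1, n_steps=10, seed=0):
    """Leapfrog HMC with a Metropolis correction for discretization error."""
    rng = np.random.default_rng(seed)
    q = np.asarray(q0, dtype=float)
    samples = []
    for _ in range(n_samples):
        p = rng.standard_normal(q.shape)        # resample momentum each iteration
        q_new, p_new = leapfrog(q, p, grad_log_p, step_size, n_steps)
        h_old = -log_p(q) + 0.5 * p @ p         # Hamiltonian = potential + kinetic
        h_new = -log_p(q_new) + 0.5 * p_new @ p_new
        if np.log(rng.uniform()) < h_old - h_new:
            q = q_new                           # accept the proposed state
        samples.append(q.copy())
    return np.array(samples)

# Target: standard Gaussian in d dimensions (strongly log-concave;
# its Hessian is constant, so any third-order regularity condition holds trivially).
d = 5
log_p = lambda q: -0.5 * q @ q
grad_log_p = lambda q: -q
samples = hmc_sample(log_p, grad_log_p, np.zeros(d), n_samples=2000)
```

Under this setup the empirical mean and variance of the samples should approach 0 and 1 per coordinate; the paper's analysis concerns how the step size, and hence the gradient-evaluation count, must scale with the dimension d for such chains to mix.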

Type
conference paper
Web of Science ID

WOS:000461852000052

Author(s)
Mangoubi, Oren  
Vishnoi, Nisheeth K.  
Date Issued

2018-01-01

Publisher

Neural Information Processing Systems (NIPS)

Publisher place

La Jolla

Published in
Advances in Neural Information Processing Systems 31 (NIPS 2018)
Series title/Series vol.

Advances in Neural Information Processing Systems

Volume

31

Subjects

Computer Science, Artificial Intelligence • Computer Science • logistic-regression • diffusion limits • algorithm

Editorial or Peer reviewed

REVIEWED

Written at

EPFL

EPFL units
LTHC  
THL3  
Event name
32nd Conference on Neural Information Processing Systems (NIPS)
Event place
Montreal, Canada
Event date
Dec 02-08, 2018

Available on Infoscience
June 18, 2019
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/156864
  • Contact
  • infoscience@epfl.ch


Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved