research article

An Accelerated First-Order Method for Non-convex Optimization on Manifolds

Criscitiello, Christopher • Boumal, Nicolas
June 8, 2022
Foundations of Computational Mathematics

We describe the first gradient methods on Riemannian manifolds to achieve accelerated rates in the non-convex case. Under Lipschitz assumptions on the Riemannian gradient and Hessian of the cost function, these methods find approximate first-order critical points faster than regular gradient descent. A randomized version also finds approximate second-order critical points. Both the algorithms and their analyses build extensively on existing work in the Euclidean case. The basic operation consists in running the Euclidean accelerated gradient descent method (appropriately safe-guarded against non-convexity) in the current tangent space, then moving back to the manifold and repeating. This requires lifting the cost function from the manifold to the tangent space, which can be done for example through the Riemannian exponential map. For this approach to succeed, the lifted cost function (called the pullback) must retain certain Lipschitz properties. As a contribution of independent interest, we prove precise claims to that effect, with explicit constants. Those claims are affected by the Riemannian curvature of the manifold, which in turn affects the worst-case complexity bounds for our optimization algorithms.
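
To make the outer loop described in the abstract concrete, the sketch below illustrates the lift / optimize-the-pullback / retract cycle on the unit sphere with a quadratic cost. It is a minimal illustration, not the algorithm analyzed in the paper: the choice of manifold and cost, the finite-difference pullback gradient, the step size, and the restart-on-increase safeguard are simplifying stand-ins for the constructions the article develops and bounds.

```python
import numpy as np

# Sketch of the outer loop from the abstract: lift the cost to the current
# tangent space via the exponential map, run an accelerated method (with a
# crude non-convexity safeguard) on the pullback, map back to the manifold,
# and repeat. Illustrated on the unit sphere S^{n-1} with f(x) = x^T A x.
# These choices are placeholders, not the paper's constructions.

def exp_map(x, s):
    """Exponential map on the sphere: geodesic step from x along tangent vector s."""
    norm = np.linalg.norm(s)
    if norm < 1e-12:
        return x
    return np.cos(norm) * x + np.sin(norm) * (s / norm)

def project_tangent(x, v):
    """Orthogonal projection of v onto the tangent space at x (vectors orthogonal to x)."""
    return v - np.dot(x, v) * x

def pullback(A, x, s):
    """Pullback of f(y) = y^T A y to the tangent space at x."""
    y = exp_map(x, s)
    return y @ A @ y

def pullback_grad(A, x, s, eps=1e-6):
    """Finite-difference gradient of the pullback, projected onto the tangent space."""
    f0 = pullback(A, x, s)
    g = np.zeros_like(s)
    for i in range(len(s)):
        e = np.zeros_like(s)
        e[i] = eps
        g[i] = (pullback(A, x, s + e) - f0) / eps
    return project_tangent(x, g)

def tangent_agd(A, x, steps=50, lr=0.05):
    """Nesterov-style accelerated descent on the pullback in the tangent space at x,
    with a simple safeguard: restart the momentum whenever the cost increases."""
    s = np.zeros_like(x)
    y, t = s.copy(), 1.0
    for _ in range(steps):
        s_new = y - lr * pullback_grad(A, x, y)
        if pullback(A, x, s_new) > pullback(A, x, s):
            y, t = s.copy(), 1.0   # safeguard: drop momentum, retry from the last good point
            continue
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = s_new + ((t - 1.0) / t_new) * (s_new - s)
        s, t = s_new, t_new
    return s

def optimize_on_sphere(A, x0, outer_iters=20):
    """Outer loop: lift, minimize the pullback locally, retract, repeat."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(outer_iters):
        x = exp_map(x, tangent_agd(A, x))
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 5
    M = rng.standard_normal((n, n))
    A = (M + M.T) / 2.0
    x = optimize_on_sphere(A, rng.standard_normal(n))
    print("f(x) =", x @ A @ x, "  smallest eigenvalue =", np.linalg.eigvalsh(A)[0])
```

On the sphere, minimizing x^T A x recovers an eigenvector of the smallest eigenvalue of A, which gives a quick sanity check on the output of this toy version.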

Type
research article
DOI
10.1007/s10208-022-09573-9
Web of Science ID
WOS:000807961400001
Author(s)
Criscitiello, Christopher • Boumal, Nicolas
Date Issued
2022-06-08
Publisher
Springer
Published in
Foundations of Computational Mathematics
Subjects
Computer Science, Theory & Methods • Mathematics, Applied • Mathematics • Computer Science • optimization on manifolds • accelerated gradient descent • non-convex optimization • first-order method • Riemannian manifold • Jacobi field • curvature
Editorial or Peer reviewed
REVIEWED
Written at
EPFL
EPFL units
OPTIM
Available on Infoscience
July 4, 2022
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/188826