Infoscience
conference paper

Averaging Stochastic Gradient Descent on Riemannian Manifolds

Tripuraneni, Nilesh • Flammarion, Nicolas • Bach, Francis • Jordan, Michael I.
2018
Proceedings of Machine Learning Research

We consider the minimization of a function defined on a Riemannian manifold M accessible only through unbiased estimates of its gradients. We develop a geometric framework to transform a sequence of slowly converging iterates generated from stochastic gradient descent (SGD) on M to an averaged iterate sequence with a robust and fast O(1/n) convergence rate. We then present an application of our framework to geodesically-strongly-convex (and possibly Euclidean non-convex) problems. Finally, we demonstrate how these ideas apply to the case of streaming k-PCA, where we show how to accelerate the slow rate of the randomized power method (without requiring knowledge of the eigengap) into a robust algorithm achieving the optimal rate of convergence.
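The two ingredients described above, SGD steps along geodesics and an averaged iterate sequence, can be sketched on the unit sphere for a toy streaming PCA problem. Everything below (the covariance `A`, the step size `0.1/sqrt(n)`, and the helpers `exp_map`/`log_map`) is an illustrative assumption for this sketch, not the authors' exact algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5
# Toy covariance with one dominant direction v (illustrative setup).
v = np.zeros(d)
v[0] = 1.0
A = 0.1 * np.eye(d) + 2.0 * np.outer(v, v)

def exp_map(x, u):
    """Exponential map on the unit sphere: move from x along tangent vector u."""
    n = np.linalg.norm(u)
    if n < 1e-12:
        return x
    return np.cos(n) * x + np.sin(n) * (u / n)

def log_map(x, y):
    """Inverse exponential map: tangent vector at x pointing toward y."""
    p = y - np.dot(x, y) * x                    # project y onto tangent space at x
    pn = np.linalg.norm(p)
    if pn < 1e-12:
        return np.zeros_like(x)
    theta = np.arccos(np.clip(np.dot(x, y), -1.0, 1.0))
    return theta * p / pn

x = rng.standard_normal(d)
x /= np.linalg.norm(x)                          # SGD iterate on the sphere
xbar = x.copy()                                 # streaming geodesic average
for n in range(1, 5001):
    z = rng.multivariate_normal(np.zeros(d), A) # one streaming sample
    g = -2.0 * np.dot(z, x) * z                 # Euclidean gradient of -x^T zz^T x
    g_tan = g - np.dot(g, x) * x                # Riemannian gradient (tangent projection)
    x = exp_map(x, -(0.1 / np.sqrt(n)) * g_tan) # SGD step along a geodesic
    xbar = exp_map(xbar, log_map(xbar, x) / (n + 1))  # running average on the manifold

print(abs(np.dot(xbar, v)))                     # alignment with the top eigenvector
```

The averaged iterate `xbar` stays on the manifold by construction (each update is a small geodesic step toward the current iterate), which is the key difference from naively averaging iterates in the ambient Euclidean space.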

Details
Type: conference paper
Author(s): Tripuraneni, Nilesh; Flammarion, Nicolas; Bach, Francis; Jordan, Michael I.
Date Issued: 2018
Published in: Proceedings of Machine Learning Research
Volume: 75
Editorial or Peer reviewed: REVIEWED
Written at: OTHER
EPFL units: TML
Available on Infoscience: December 2, 2019
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/163515
Contact: infoscience@epfl.ch

Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.