Averaging Stochastic Gradient Descent on Riemannian Manifolds
We propose an estimator for the mean of a random vector in R^d that can be computed in time O(n^{3.5} + n^2 d) for n i.i.d. samples and that has error bounds matching the sub-Gaussian case. The only assumptions we make about the data distribution are that it has finite mean and covariance; in particular, we make no assumptions about higher-order moments. Like the polynomial-time estimator introduced by Hopkins (2018), which is based on the sum-of-squares hierarchy, our estimator achieves optimal statistical efficiency in this challenging setting, but it has a significantly faster runtime and a simpler analysis.
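As background on the setting the abstract describes, the sketch below shows the classical coordinate-wise median-of-means baseline for heavy-tailed mean estimation. It is not the estimator proposed in the paper (which attains optimal sub-Gaussian error bounds with a faster runtime); the function name, the bucket count k, and the test distribution are illustrative assumptions.

```python
import numpy as np

def median_of_means(samples, k):
    """Coordinate-wise median-of-means estimate of the mean.

    Splits n samples into k disjoint buckets, averages each bucket,
    and returns the coordinate-wise median of the bucket means.
    Requires only finite mean and covariance of the distribution.
    """
    buckets = np.array_split(samples, k)                         # k groups of ~n/k rows
    bucket_means = np.stack([b.mean(axis=0) for b in buckets])   # shape (k, d)
    return np.median(bucket_means, axis=0)                       # coordinate-wise median

# Usage: heavy-tailed samples (Student-t, df=2.5, so finite variance
# but no higher moments) centered at a known mean.
rng = np.random.default_rng(0)
true_mean = np.ones(5)
X = true_mean + rng.standard_t(df=2.5, size=(10_000, 5))
print(median_of_means(X, k=100))   # close to [1, 1, 1, 1, 1]
```

This baseline needs only finite mean and covariance, matching the abstract's assumptions, but its high-dimensional error bounds fall short of the sub-Gaussian rates that the paper's estimator achieves.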
Type
conference paper
Publication date
2018
Published in
Proceedings of Machine Learning Research
Volume
75
Peer reviewed
REVIEWED
Available on Infoscience
December 2, 2019