Infoscience

Working paper

Accelerated SGD for Non-Strongly-Convex Least Squares

Varre, Aditya Vardhan • Flammarion, Nicolas
March 3, 2022

We consider stochastic approximation for the least squares regression problem in the non-strongly convex setting. We present the first practical algorithm that achieves the optimal prediction error rate in its dependence on the noise of the problem, namely $O(d/t)$, while accelerating the forgetting of the initial conditions to $O(d/t^2)$. Our new algorithm is based on a simple modification of accelerated gradient descent. We provide convergence results for both the averaged and the last iterate of the algorithm. To establish the tightness of these new bounds, we present a matching lower bound in the noiseless setting, thus showing the optimality of our algorithm.
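To make the abstract's setting concrete, below is a minimal, self-contained sketch of generic Nesterov-style accelerated SGD on a synthetic least-squares stream. It is an illustration under assumed choices (isotropic Gaussian features, a constant step size, the classical momentum schedule), not the preprint's algorithm, whose specific step-size and momentum design is precisely its contribution.

```python
import numpy as np

# Hedged sketch: generic Nesterov-style accelerated SGD on streaming least squares.
# This is NOT the exact algorithm of the preprint; the step size and momentum
# schedule below are illustrative assumptions.

rng = np.random.default_rng(0)
d = 20                                   # dimension
theta_star = rng.normal(size=d)          # ground-truth regressor
sigma = 0.1                              # additive label-noise level

def sample():
    """Draw one (x, y) pair from the streaming least-squares model."""
    x = rng.normal(size=d)               # isotropic Gaussian features
    y = x @ theta_star + sigma * rng.normal()
    return x, y

theta = np.zeros(d)                      # current iterate (starts far from theta_star)
theta_prev = np.zeros(d)                 # previous iterate, used for momentum
theta_avg = np.zeros(d)                  # running (Polyak-Ruppert) average
gamma = 1e-3                             # constant step size, kept small for stability

T = 20_000
for t in range(1, T + 1):
    beta = (t - 1) / (t + 2)             # classical Nesterov momentum weight
    v = theta + beta * (theta - theta_prev)    # extrapolated point
    x, y = sample()
    grad = (x @ v - y) * x               # stochastic gradient of 0.5 * (x @ v - y)**2
    theta_prev = theta
    theta = v - gamma * grad
    theta_avg += (theta - theta_avg) / t # online mean of the iterates

# With isotropic features, the excess risk equals 0.5 * ||theta - theta_star||^2.
for name, th in [("last iterate", theta), ("averaged iterate", theta_avg)]:
    print(f"{name}: excess risk ~ {0.5 * np.linalg.norm(th - theta_star) ** 2:.4f}")
```

In this generic form, the momentum term amplifies the multiplicative gradient noise, so such a naive scheme is only stable for very small step sizes; the abstract's claim of a "practical" algorithm with $O(d/t)$ noise dependence and $O(d/t^2)$ forgetting of initial conditions refers to a modification designed to avoid this trade-off.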

Files
Name: 2203.01744.pdf
Type: Preprint
Version: Submitted version (Preprint)
Access type: Open access
License Condition: CC BY
Size: 562.66 KB
Format: Adobe PDF
Checksum (MD5): 364ceecb8481f494c6af221d4293dab5
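The MD5 checksum above can be used to verify the integrity of a downloaded copy. A minimal sketch, assuming the PDF has been saved locally under its listed name:

```python
import hashlib

# Compare the local file's MD5 digest against the checksum from the record above.
expected = "364ceecb8481f494c6af221d4293dab5"
with open("2203.01744.pdf", "rb") as f:   # path assumes the file is in the working directory
    digest = hashlib.md5(f.read()).hexdigest()
print("match" if digest == expected else "mismatch", digest)
```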

Contact: infoscience@epfl.ch

Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.