research article

On the linear convergence of the stochastic gradient method with constant step-size

Cevher, Volkan • Vu, Cong Bang
September 25, 2018
Optimization Letters

The strong growth condition (SGC) is known to be a sufficient condition for linear convergence of the stochastic gradient method with a constant step-size γ (SGM-CS). In this paper, we provide a necessary condition for the linear convergence of SGM-CS that is weaker than SGC. Moreover, when this necessary condition is violated up to an additive perturbation σ, we show that both the projected stochastic gradient method with a constant step-size, under the restricted strong convexity assumption, and the proximal stochastic gradient method, under the strong convexity assumption, exhibit linear convergence to a noise-dominated region whose distance to the optimal solution is proportional to γσ.
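
As a quick illustration of the setting (a sketch added alongside the record, not the paper's own experiments): SGM-CS repeatedly samples one component function and takes a gradient step with a fixed step-size γ. The toy Python script below runs it on a synthetic least-squares problem; the data (A, b), the noise level, and the helper name sgm_cs are all hypothetical choices made for this sketch.

```python
# Minimal illustrative sketch of the stochastic gradient method with a
# constant step-size (SGM-CS) on a toy strongly convex least-squares problem,
#   f(x) = (1/2n) * sum_i (a_i^T x - b_i)^2.
# The problem data below is synthetic; this is not the paper's implementation.

import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.1 * rng.standard_normal(n)  # additive noise, so the
                                               # per-sample gradients do not
                                               # all vanish at the optimum

# Full-batch least-squares solution, used as the reference optimum x*.
x_star, *_ = np.linalg.lstsq(A, b, rcond=None)

def sgm_cs(gamma, iters=20000):
    """Run SGM with constant step-size gamma; return final distance to x*."""
    x = np.zeros(d)
    for _ in range(iters):
        i = rng.integers(n)
        grad = (A[i] @ x - b[i]) * A[i]  # gradient of one squared residual
        x -= gamma * grad
    return np.linalg.norm(x - x_star)

# Smaller gamma => smaller noise floor, but slower linear progress.
for gamma in (5e-2, 1e-2, 1e-3):
    print(f"gamma={gamma:.0e}  final distance to x*: {sgm_cs(gamma):.4f}")
```

In this toy run the iterates stall at a noise floor that shrinks as γ shrinks, at the price of slower progress per iteration, which is the trade-off behind the noise-dominated region described in the abstract.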

Files
  • Name: linearSPG_postprint.pdf
  • Type: Postprint
  • Version: Accepted version
  • Access type: openaccess
  • Size: 222.1 KB
  • Format: Adobe PDF
  • Checksum (MD5): 4c406e8adc60d7da920279bffad8c144

  • Name: linearSPG.pdf
  • Access type: openaccess
  • Size: 199.64 KB
  • Format: Adobe PDF
  • Checksum (MD5): 1f3371b502b4365aaf3f8cec1cf92cf9
