On the linear convergence of the stochastic gradient method with constant step-size

The strong growth condition (SGC) is known to be sufficient for linear convergence of the stochastic gradient method with a constant step-size γ (SGM-CS). In this paper, we provide a necessary condition for the linear convergence of SGM-CS that is weaker than SGC. Moreover, when this necessary condition is violated up to an additive perturbation σ, we show that both the projected stochastic gradient method with a constant step-size, under the restricted strong convexity assumption, and the proximal stochastic gradient method, under the strong convexity assumption, converge linearly to a noise-dominated region whose distance to the optimal solution is proportional to γσ.
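The behavior described above can be illustrated with a minimal sketch (not the paper's experiments): constant step-size SGD on a simple strongly convex quadratic, with an additive gradient perturbation of scale σ. The problem, step-size, and noise model are assumptions chosen for illustration; the iterates contract quickly toward the optimum but stall in a region whose radius scales with γσ.

```python
import numpy as np

# Illustrative sketch: SGM-CS on the strongly convex quadratic
# f(x) = 0.5 * ||x - x_star||^2, whose gradient is (x - x_star).
# We add zero-mean noise of scale sigma to each gradient, so the
# necessary condition for linear convergence is violated up to an
# additive perturbation, and the iterates settle in a region
# around x_star whose radius is proportional to gamma * sigma.
rng = np.random.default_rng(0)
d = 10
x_star = rng.standard_normal(d)          # optimal solution (hypothetical)
gamma, sigma, n_steps = 0.1, 0.5, 2000   # constant step-size, noise scale

x = np.zeros(d)
for _ in range(n_steps):
    noise = sigma * rng.standard_normal(d)  # additive gradient perturbation
    grad = (x - x_star) + noise             # stochastic gradient of f
    x = x - gamma * grad                    # constant step-size update

dist = np.linalg.norm(x - x_star)
print(f"final distance to optimum: {dist:.3f}")
```

With these (assumed) parameters the final distance is far smaller than the initial one, but it does not shrink to zero as more iterations are run: the residual error is dominated by the noise term γσ.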


Published in:
Optimization Letters, 12, 1-11
Year:
Sep 25 2018




Record created 2018-11-02, last modified 2019-06-19
