On the linear convergence of the stochastic gradient method with constant step-size
The strong growth condition (SGC) is known to be a sufficient condition for linear convergence of the stochastic gradient method using a constant step-size γ (SGM-CS). In this paper, we provide a necessary condition for the linear convergence of SGM-CS that is weaker than SGC. Moreover, when this necessary condition is violated up to an additive perturbation σ, we show that both the projected stochastic gradient method using a constant step-size, under the restricted strong convexity assumption, and the proximal stochastic gradient method, under the strong convexity assumption, exhibit linear convergence to a noise-dominated region, whose distance to the optimal solution is proportional to γσ.
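The behavior described in the abstract can be illustrated with a minimal numerical sketch (not the paper's analysis): SGM with a constant step-size on a consistent least-squares problem, where an additive perturbation of magnitude σ is injected into each stochastic gradient. With σ = 0 the interpolation setting makes the SGC-style condition hold and the iterates converge linearly to the solution; with σ > 0 they converge only to a noise-dominated region whose radius scales with γσ. All problem dimensions and parameter values below are arbitrary choices for illustration.

```python
import numpy as np

# Illustrative sketch: SGM with constant step-size gamma on a
# strongly convex least-squares problem f(x) = (1/2n) ||Ax - b||^2.
# The system is consistent (b = A x_star), so without the additive
# sigma-perturbation the stochastic gradients vanish at x_star and
# linear convergence to x_star is possible; with sigma > 0 the
# iterates settle in a region around x_star proportional to gamma*sigma.

rng = np.random.default_rng(0)
n, d = 50, 10
A = rng.standard_normal((n, d))
x_star = rng.standard_normal(d)
b = A @ x_star                      # consistent system: x_star is optimal

gamma = 0.01                        # constant step-size
sigma = 0.5                         # additive gradient perturbation
x = np.zeros(d)

for _ in range(20000):
    i = rng.integers(n)             # sample one component function
    # stochastic gradient of f_i plus an additive perturbation
    grad = (A[i] @ x - b[i]) * A[i] + sigma * rng.standard_normal(d)
    x -= gamma * grad

dist = np.linalg.norm(x - x_star)
# dist stagnates at a noise floor on the order of gamma * sigma
# (up to problem-dependent constants), rather than decreasing to 0
```

Rerunning with `sigma = 0.0` makes `dist` shrink geometrically toward machine precision, while decreasing `gamma` with `sigma` fixed shrinks the noise floor, consistent with the γσ scaling stated in the abstract.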