Cevher, Volkan; Vu, Cong Bang
Published: 2018-09-25 (record date: 2018-11-02)
DOI: 10.1007/s11590-018-1331-1
URL: https://infoscience.epfl.ch/handle/20.500.14299/149623

Title: On the linear convergence of the stochastic gradient method with constant step-size

Abstract: The strong growth condition (SGC) is known to be a sufficient condition for linear convergence of the stochastic gradient method using a constant step-size γ (SGM-CS). In this paper, we provide a necessary condition, for the linear convergence of SGM-CS, that is weaker than SGC. Moreover, when this necessary condition is violated up to an additive perturbation σ, we show that both the projected stochastic gradient method using a constant step-size, under the restricted strong convexity assumption, and the proximal stochastic gradient method, under the strong convexity assumption, exhibit linear convergence to a noise-dominated region, whose distance to the optimal solution is proportional to γσ.

Keywords: Stochastic gradient; Linear convergence; Strong growth condition

Type: journal article (research article)
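
The behavior described in the abstract — constant step-size SGD converging linearly to a noise-dominated region around the optimum — can be illustrated with a minimal sketch. This is not code from the paper; the quadratic objective, noise model, and parameter values (γ, σ) below are illustrative assumptions chosen so the effect is easy to observe.

```python
import numpy as np

# Minimal sketch (not from the paper): SGD with constant step-size gamma
# on f(x) = 0.5 * ||x - x_star||^2, with additive gradient noise of
# standard deviation sigma. The iterates contract linearly toward x_star
# and then stall in a region whose radius scales with gamma * sigma.
rng = np.random.default_rng(0)
d = 10
x_star = rng.standard_normal(d)      # hypothetical optimum
gamma, sigma = 0.1, 0.5              # illustrative step-size and noise level

x = np.zeros(d)
errors = []
for _ in range(2000):
    noise = sigma * rng.standard_normal(d)
    grad = (x - x_star) + noise      # stochastic gradient of f at x
    x = x - gamma * grad
    errors.append(np.linalg.norm(x - x_star))

initial_err = np.linalg.norm(x_star)
final_err = errors[-1]
```

After an initial phase of fast (linear-rate) decrease, `final_err` plateaus well above zero but far below the starting error, consistent with convergence to a noise-dominated neighborhood rather than to the exact optimum.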