Nascimento, Vitor H.; Sayed, Ali H.
Published: 1998; deposited: 2017-12-19
https://infoscience.epfl.ch/handle/20.500.14299/143119

Title: Stability of the LMS adaptive filter by means of a state equation
Type: conference paper

Abstract: This work studies the mean-square stability of stochastic gradient algorithms without resorting to slow adaptation approximations or to the widely used, yet rarely applicable, independence assumptions. This is achieved by reducing the study of the mean-square convergence of an adaptive filter to the study of the exponential stability of a linear time-invariant state equation. The size of the coefficient matrix of the state equation, however, turns out to grow exponentially fast with the length of the filter, so it becomes computationally infeasible to manipulate the matrix directly. It is instead shown that the coefficient matrix is sparse and highly structured. By exploiting these two properties, and by applying a sequence of carefully chosen similarity transformations to the coefficient matrix, an upper bound on the step size is found that guarantees stability.
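
For context, the sketch below is a minimal Python implementation of the LMS recursion studied in the paper, w(k+1) = w(k) + mu * e(k) * u(k). The signal model, filter length, and step-size value are illustrative assumptions only; the sketch does not reproduce the state-equation analysis or the stability bound derived in the paper.

import numpy as np

# Minimal LMS adaptive filter sketch (illustrative only; the step size below
# is an arbitrary small value, not the bound derived in the paper).
rng = np.random.default_rng(0)

M = 8                      # filter length (assumed for illustration)
mu = 0.01                  # step size (assumed; must be small enough for stability)
n_samples = 5000

w_true = rng.standard_normal(M)          # unknown system to be identified
w = np.zeros(M)                          # adaptive filter weights

u = rng.standard_normal(n_samples + M)   # white regressor sequence (assumed model)

for k in range(n_samples):
    u_k = u[k:k + M][::-1]                              # regressor vector at time k
    d_k = w_true @ u_k + 0.01 * rng.standard_normal()   # desired signal plus noise
    e_k = d_k - w @ u_k                                 # a priori estimation error
    w = w + mu * e_k * u_k                              # LMS weight update

print("final weight-error norm:", np.linalg.norm(w_true - w))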