Li, Yen-Huan; Cevher, Volkan

Title: Convergence of the Exponentiated Gradient Method with Armijo Line Search

Record date: 2018-12-07
Publication year: 2019
DOI: 10.1007/s10957-018-1428-9
https://infoscience.epfl.ch/handle/20.500.14299/151706

Abstract: Consider the problem of minimizing a convex differentiable function on the probability simplex, spectrahedron, or set of quantum density matrices. We prove that the exponentiated gradient method with Armijo line search always converges to the optimum, provided the sequence of iterates possesses a strictly positive limit point (element-wise for the vector case, and with respect to the Löwner partial ordering for the matrix case). To the best of our knowledge, this is the first convergence result for a mirror descent-type method that requires only differentiability. The proof exploits the self-concordant likeness of the log-partition function, which is of independent interest.

Keywords: Exponentiated gradient method; Armijo line search; Self-concordant likeness; Peierls–Bogoliubov inequality

Document type: text::journal::journal article::research article
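For illustration, the vector-case algorithm studied in the abstract can be sketched as follows: a multiplicative (exponentiated gradient) update on the probability simplex, with the step size chosen by Armijo backtracking. This is a minimal sketch, not the paper's exact formulation; the function names, constants (`eta0`, `c`, `shrink`), and the test objective `f(x) = ||x - p||^2` are illustrative assumptions.

```python
import numpy as np

def exp_grad_armijo(f, grad, x0, eta0=1.0, c=1e-4, shrink=0.5,
                    max_iter=200, tol=1e-10):
    """Exponentiated gradient descent on the probability simplex with
    Armijo backtracking line search (illustrative sketch, not the
    authors' exact algorithm)."""
    x = np.asarray(x0, dtype=float)
    x = x / x.sum()  # project the start point onto the simplex
    for _ in range(max_iter):
        g = grad(x)
        fx = f(x)
        eta = eta0
        while True:
            # Multiplicative (mirror-descent/EG) update, then renormalize.
            # Shifting by g.max() only rescales y before normalization,
            # so it changes nothing mathematically but avoids overflow.
            y = x * np.exp(-eta * (g - g.max()))
            y = y / y.sum()
            # Armijo sufficient-decrease condition; shrink eta until met.
            if f(y) <= fx + c * g.dot(y - x) or eta < 1e-12:
                break
            eta *= shrink
        if np.linalg.norm(y - x, 1) < tol:  # negligible progress: stop
            x = y
            break
        x = y
    return x

# Hypothetical example: minimize f(x) = ||x - p||^2 over the simplex,
# whose minimizer is the strictly positive interior point p itself.
p = np.array([0.5, 0.3, 0.2])
f = lambda x: np.sum((x - p) ** 2)
grad = lambda x: 2 * (x - p)
x_star = exp_grad_armijo(f, grad, np.ones(3) / 3)
```

Because the minimizer `p` is strictly positive, the iterates have a strictly positive limit point, matching the hypothesis under which the paper proves convergence.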