Convergence of the Exponentiated Gradient Method with Armijo Line Search
Consider the problem of minimizing a convex differentiable function over the probability simplex, the spectrahedron, or the set of quantum density matrices. We prove that the exponentiated gradient method with Armijo line search always converges to the optimum, provided that the sequence of iterates possesses a strictly positive limit point (element-wise for the vector case, and with respect to the Löwner partial ordering for the matrix case). To the best of our knowledge, this is the first convergence result for a mirror descent-type method that requires only differentiability. The proof exploits self-concordant likeness of the log-partition function, which is of independent interest.
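For intuition, the vector-case method described above can be sketched as follows. This is a minimal, illustrative implementation assuming a convex differentiable objective on the probability simplex; all parameter names and default values (`eta0`, `beta`, `sigma`, the stopping tolerance) are our own choices, not taken from the paper.

```python
import numpy as np

def eg_armijo(f, grad, x0, eta0=1.0, beta=0.5, sigma=1e-4,
              max_iter=500, tol=1e-10):
    """Exponentiated gradient with Armijo backtracking line search
    on the probability simplex (sketch; hyperparameters illustrative)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        eta = eta0
        while True:
            # Multiplicative (entropic mirror-descent) update, then renormalize.
            # Shifting by g.min() keeps all exponents <= 0 for stability;
            # the shift cancels after normalization.
            y = x * np.exp(-eta * (g - g.min()))
            y /= y.sum()
            # Armijo sufficient-decrease test along the feasible step y - x.
            if f(y) <= f(x) + sigma * g.dot(y - x):
                break
            eta *= beta  # backtrack
        if np.abs(y - x).max() < tol:
            return y
        x = y
    return x
```

As a quick sanity check, minimizing the Euclidean distance to an interior point of the simplex recovers that point, since the simplex constraint is inactive at the optimum.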