Convergence of the Exponentiated Gradient Method with Armijo Line Search

Consider the problem of minimizing a convex differentiable function on the probability simplex, spectrahedron, or set of quantum density matrices. We prove that the exponentiated gradient method with Armijo line search always converges to the optimum if the sequence of iterates possesses a strictly positive limit point (element-wise for the vector case, and with respect to the Löwner partial ordering for the matrix case). To the best of our knowledge, this is the first convergence result for a mirror descent-type method that only requires differentiability. The proof exploits self-concordant likeness of the log-partition function, which is of independent interest.
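For the vector case, the method described in the abstract can be sketched as follows: a multiplicative (entropic mirror descent) update on the simplex, with the step size chosen by Armijo backtracking. This is a minimal illustrative sketch, not the paper's exact algorithm; the parameter names (`eta0`, `beta`, `sigma`) and line-search constants are assumptions for the purpose of the example.

```python
import numpy as np

def exponentiated_gradient_armijo(f, grad, x0, eta0=1.0, beta=0.5,
                                  sigma=1e-4, max_iter=500, tol=1e-10):
    """Exponentiated gradient on the probability simplex with Armijo
    backtracking. Illustrative sketch; the paper's exact line-search
    constants and stopping rule may differ."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        eta = eta0
        while True:
            # Multiplicative (mirror) update, renormalized to the simplex;
            # subtracting g.max() only improves numerical stability.
            y = x * np.exp(-eta * (g - g.max()))
            y /= y.sum()
            # Armijo sufficient-decrease test along the EG step.
            if f(y) <= f(x) + sigma * g.dot(y - x) or eta < 1e-12:
                break
            eta *= beta  # backtrack
        if np.abs(y - x).sum() < tol:
            x = y
            break
        x = y
    return x

# Toy usage: minimize a quadratic whose unconstrained minimizer c lies
# in the simplex interior, so the iterates have a strictly positive
# limit point, as the theorem requires.
c = np.array([0.2, 0.5, 0.3])
f = lambda x: 0.5 * np.sum((x - c) ** 2)
grad = lambda x: x - c
x_star = exponentiated_gradient_armijo(f, grad, np.ones(3) / 3)
```

The strict-positivity condition in the theorem matters here: the multiplicative update can never revive a coordinate once it reaches zero, so a limit point on the boundary of the simplex is what the assumption rules out.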


Published in: Journal of Optimization Theory and Applications
Date: 3 December 2018
 Record created 2018-12-07, last modified 2019-03-17
