Abstract

The self-concordant-like property of a smooth convex function is a new analytical structure that generalizes the self-concordant notion. While a wide variety of important applications feature the self-concordant-like property, this concept has heretofore remained unexploited in convex optimization. To address this gap, we develop a variable metric framework for minimizing the sum of a “simple” convex function and a self-concordant-like function. We introduce a new analytic step-size selection procedure and prove that the basic gradient algorithm has improved convergence guarantees as compared to “fast” algorithms that rely on the Lipschitz gradient property. Our numerical tests with real data sets show that practice indeed follows theory.
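
For orientation, the composite problem and the self-concordant-like condition can be sketched as follows; this is a minimal sketch assuming the standard formalization from the literature, and the constant M_f and the notation phi, u are not stated in the abstract itself:

  % Composite minimization: g is "simple" (proximally tractable) convex,
  % f is smooth, convex, and self-concordant-like.
  \min_{x \in \mathbb{R}^n} \; F(x) := f(x) + g(x)

  % Self-concordant-like condition on f, along any line x + t u:
  % (assumed definition; compare the self-concordant condition
  %  |\varphi'''(t)| \le 2\,\varphi''(t)^{3/2})
  |\varphi'''(t)| \;\le\; M_f \,\|u\|_2\, \varphi''(t),
  \qquad \varphi(t) := f(x + t u), \quad M_f \ge 0.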
