Fast Proximal Algorithms For Self-Concordant Function Minimization With Application To Sparse Graph Selection
The convex $\ell_1$-regularized $\log\det$ divergence criterion has been shown to produce theoretically consistent graph learning. However, this objective is challenging to minimize: the $\ell_1$-regularizer is nonsmooth, the gradient of the $\log\det$ term is not globally Lipschitz continuous, and the problem is high-dimensional. Exploiting the self-concordance of the objective, we propose a new adaptive step size selection and present the (F)PS ((F)ast Proximal algorithms for Self-concordant functions) algorithmic framework, which converges linearly and exhibits superior empirical performance compared to state-of-the-art first-order methods.
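To make the problem setting concrete, the following is a minimal proximal gradient sketch in Python/NumPy for the objective $\min_{\Theta \succ 0} -\log\det(\Theta) + \mathrm{tr}(S\Theta) + \lambda \|\Theta\|_1$. It is not the paper's (F)PS method: the step size here uses a simple backtracking rule that only guards positive definiteness, standing in for the paper's adaptive selection based on self-concordance; the names `S`, `lam`, and all parameters are illustrative assumptions.

```python
import numpy as np

def soft_threshold(X, tau):
    # Elementwise soft-thresholding: the proximal operator of tau * ||.||_1.
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def prox_grad_graph_select(S, lam, n_iter=200, alpha0=1.0, beta=0.5):
    """Sketch of proximal gradient for sparse inverse covariance selection:
       min over Theta > 0 of  -log det(Theta) + tr(S @ Theta) + lam * ||Theta||_1.
       NOT the (F)PS step rule from the paper; backtracking here only ensures
       the iterate stays in the positive definite cone."""
    p = S.shape[0]
    Theta = np.eye(p)
    for _ in range(n_iter):
        grad = S - np.linalg.inv(Theta)  # gradient of the smooth part
        alpha = alpha0
        while True:
            cand = soft_threshold(Theta - alpha * grad, alpha * lam)
            try:
                np.linalg.cholesky(cand)  # succeeds iff cand is positive definite
                break
            except np.linalg.LinAlgError:
                alpha *= beta  # shrink the step until we stay in the PD cone
        Theta = cand
    return Theta
```

On data, `S` would be the sample covariance of the observations; the returned `Theta` is a sparse estimate of the inverse covariance, whose nonzero off-diagonal entries encode the selected graph edges.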