
Conference paper

Fast Proximal algorithms for Self-concordant function minimization with application to sparse graph selection

The convex $\ell_1$-regularized $\log\det$ divergence criterion has been shown to produce theoretically consistent graph learning. However, minimizing this objective is challenging: the $\ell_1$-regularizer is nonsmooth, the gradient of the $\log\det$ term is not globally Lipschitz continuous, and the problem is high-dimensional. Exploiting the self-concordance of the objective, we propose a new adaptive step size selection and present the (F)PS ((F)ast Proximal algorithms for Self-concordant functions) algorithmic framework, which attains linear convergence and exhibits superior empirical performance compared to state-of-the-art first-order methods.
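To make the problem concrete, below is a minimal sketch of a plain proximal gradient iteration for the $\ell_1$-regularized $\log\det$ objective $\min_\Theta \operatorname{tr}(S\Theta) - \log\det\Theta + \lambda\|\Theta\|_1$. The function names (`soft_threshold`, `prox_grad_logdet`) and all parameter choices are illustrative assumptions, not the paper's algorithm; where the (F)PS framework derives the step size from self-concordance, this sketch falls back to simple backtracking that only guards positive definiteness.

```python
import numpy as np

def soft_threshold(X, tau):
    """Entrywise soft-thresholding: the proximal operator of tau * ||.||_1."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def prox_grad_logdet(S, lam, alpha=0.5, n_iter=200):
    """Sketch of proximal gradient for tr(S*Theta) - logdet(Theta) + lam*||Theta||_1.

    S: empirical covariance (p x p); lam: l1 weight; alpha: initial step size.
    NOTE: illustrative only -- the paper's (F)PS method instead sets the step
    from the self-concordance of the smooth part, which is not reproduced here.
    """
    p = S.shape[0]
    Theta = np.eye(p)  # identity is a feasible, positive-definite start
    for _ in range(n_iter):
        grad = S - np.linalg.inv(Theta)  # gradient of the smooth part
        step = alpha
        while True:
            cand = soft_threshold(Theta - step * grad, step * lam)
            # Accept the candidate only if it stays in the positive-definite
            # cone, where logdet is defined (Cholesky succeeds iff PD).
            try:
                np.linalg.cholesky(cand)
                break
            except np.linalg.LinAlgError:
                step *= 0.5  # halve the step and retry
        Theta = cand
    return Theta
```

Because the soft-thresholded update with a vanishing step approaches the current (positive-definite) iterate, the backtracking loop always terminates; the point of the self-concordance-based step size in the paper is precisely to avoid such ad hoc safeguards.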

Reference: EPFL-CONF-186829

