Fast Proximal Algorithms for Self-Concordant Function Minimization with Application to Sparse Graph Selection

The convex $\ell_1$-regularized $\log\det$ divergence criterion has been shown to produce theoretically consistent graph learning. However, this objective is challenging to optimize: the $\ell_1$-regularizer is nonsmooth, the gradient of the $\log\det$ term is not globally Lipschitz continuous, and the problem is high-dimensional. Exploiting the self-concordance of the objective, we propose a new adaptive step-size selection and present the (F)PS ((F)ast Proximal algorithms for Self-concordant functions) algorithmic framework, which attains linear convergence and exhibits superior empirical performance compared to state-of-the-art first-order methods.
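For concreteness, the sketch below illustrates the kind of composite problem the abstract refers to: minimizing $-\log\det(X) + \mathrm{tr}(SX) + \lambda\|X\|_1$ over positive-definite $X$ by proximal gradient descent with soft-thresholding. The function names (prox_grad_graph_selection, soft_threshold) and the simple backtracking rule are illustrative assumptions; in particular, the backtracking is a plain stand-in for the paper's self-concordance-based adaptive step size, not the (F)PS algorithm itself.

import numpy as np

def soft_threshold(A, tau):
    """Entrywise soft-thresholding: the proximal operator of tau * ||.||_1."""
    return np.sign(A) * np.maximum(np.abs(A) - tau, 0.0)

def prox_grad_graph_selection(S, lam, n_iter=200, tol=1e-6):
    """Minimize  -log det(X) + trace(S X) + lam * ||X||_1  over X > 0
    by proximal gradient descent.  The backtracking rule below merely
    keeps the iterate positive definite and the objective decreasing;
    it is a simple stand-in for the self-concordance-based adaptive
    step of the paper's (F)PS framework."""
    p = S.shape[0]

    def smooth(X):
        # -log det(X) + tr(S X); +inf if X is not positive definite.
        try:
            L = np.linalg.cholesky(X)
        except np.linalg.LinAlgError:
            return np.inf
        return -2.0 * np.log(np.diag(L)).sum() + np.trace(S @ X)

    X = np.eye(p)                               # feasible starting point
    obj = smooth(X) + lam * np.abs(X).sum()
    for _ in range(n_iter):
        G = S - np.linalg.inv(X)                # gradient of the smooth part
        t = 1.0
        while True:                             # backtracking line search
            X_new = soft_threshold(X - t * G, t * lam)
            f_new = smooth(X_new) + lam * np.abs(X_new).sum()
            if f_new < obj or t < 1e-12:
                break
            t *= 0.5
        if not np.isfinite(f_new) or obj - f_new < tol * max(1.0, abs(obj)):
            break
        X, obj = X_new, f_new
    return X

For instance, with S the sample covariance of standardized observations (e.g., np.cov(data, rowvar=False)) and a regularization weight such as lam = 0.1, the returned X is a sparse estimate of the precision (inverse covariance) matrix, whose nonzero off-diagonal entries define the edges of the learned graph.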


Published in:
Proceedings of the 38th IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP)
Presented at:
38th International Conference on Acoustics, Speech, and Signal Processing (ICASSP), Vancouver, Canada, May 26-31, 2013
Year:
2013