Fast Proximal Algorithms For Self-Concordant Function Minimization With Application To Sparse Graph Selection

The convex l1-regularized log det divergence criterion has been shown to produce theoretically consistent graph learning. However, minimizing this objective is challenging: the l1 regularization is nonsmooth, the gradient of the log det term is not globally Lipschitz continuous, and the problem is high-dimensional. Exploiting the self-concordance of the objective, we propose a new adaptive step-size selection and present the (F)PS ((F)ast Proximal algorithms for Self-concordant functions) algorithmic framework, which has linear convergence and exhibits superior empirical performance compared to state-of-the-art first-order methods.
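The abstract does not spell out the (F)PS iterations, but the problem it targets is the standard l1-regularized log det (sparse inverse covariance) estimation. As a rough illustration only, the sketch below shows a generic proximal-gradient loop for that objective with a crude positive-definiteness backtracking rule standing in for the paper's self-concordance-based adaptive step size; the function and parameter names (prox_grad_logdet, lam, step, shrink) are hypothetical and not taken from the paper.

```python
import numpy as np

def soft_threshold(A, tau):
    """Entrywise soft-thresholding: the proximal operator of tau * ||.||_1."""
    return np.sign(A) * np.maximum(np.abs(A) - tau, 0.0)

def prox_grad_logdet(S, lam, n_iter=200, step=1.0, shrink=0.5):
    """Generic proximal-gradient sketch (not the paper's (F)PS method) for
        minimize_{X > 0}  -log det X + tr(S X) + lam * ||X||_1.
    The gradient of the smooth part is S - inv(X); the nonsmooth l1 term is
    handled by soft-thresholding. The step is halved until the new iterate
    stays positive definite, a crude stand-in for the adaptive,
    self-concordance-based step-size rule proposed in the paper."""
    p = S.shape[0]
    X = np.eye(p)                          # feasible (positive definite) start
    for _ in range(n_iter):
        grad = S - np.linalg.inv(X)        # gradient of -log det X + tr(S X)
        t = step
        while True:
            X_new = soft_threshold(X - t * grad, t * lam)
            try:
                np.linalg.cholesky(X_new)  # positive-definiteness check
                break
            except np.linalg.LinAlgError:
                t *= shrink                # backtrack on the step size
        X = X_new
    return X

# Toy usage: estimate a sparse precision matrix from a sample covariance.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 5))
S = np.cov(A, rowvar=False) + 1e-3 * np.eye(5)
X_hat = prox_grad_logdet(S, lam=0.1)
```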


Published in:
Proceedings of the 2013 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 6585-6589
Presented at:
IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), Vancouver, BC, Canada, May 26-31, 2013
Year:
2013
Publisher:
New York, IEEE
ISBN:
978-1-4799-0356-6
