Fast Proximal Algorithms For Self-Concordant Function Minimization With Application To Sparse Graph Selection

The convex ℓ1-regularized log det divergence criterion has been shown to produce theoretically consistent graph learning. Minimizing this objective is challenging, however: the ℓ1 regularizer is nonsmooth, the gradient of the log det term is not globally Lipschitz, and the problem is high-dimensional. Exploiting the self-concordance of the objective, we propose a new adaptive step-size selection rule and present the (F)PS ((F)ast Proximal algorithms for Self-concordant functions) algorithmic framework, which achieves linear convergence and exhibits superior empirical performance compared to state-of-the-art first-order methods.
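To make the problem setting concrete, the sketch below implements a plain proximal-gradient loop for the ℓ1-regularized log det criterion, min_X −log det X + tr(SX) + λ‖X‖1 over positive-definite X. This is a minimal illustration, not the paper's (F)PS method: the adaptive, self-concordance-based step-size rule is replaced here by a simple backtracking line search that halves the step until the iterate stays in the positive-definite cone and satisfies a standard sufficient-decrease test. All function names and parameter choices are illustrative assumptions.

```python
import numpy as np

def soft_threshold(A, tau):
    # Entrywise soft-thresholding: the proximal operator of tau * ||.||_1.
    return np.sign(A) * np.maximum(np.abs(A) - tau, 0.0)

def prox_grad_logdet(S, lam, t0=0.5, max_iter=200, tol=1e-6):
    """Proximal-gradient sketch (NOT the paper's (F)PS algorithm) for
        min_X  -log det X + tr(S X) + lam * ||X||_1   over X > 0,
    where S is a sample covariance matrix. The step is backtracked
    whenever the candidate leaves the positive-definite cone or fails
    a sufficient-decrease check against the quadratic upper bound."""
    p = S.shape[0]
    X = np.eye(p)  # identity is a safe positive-definite start

    def smooth(M):
        _, logdet = np.linalg.slogdet(M)
        return -logdet + np.trace(S @ M)

    f = smooth(X)
    for _ in range(max_iter):
        grad = S - np.linalg.inv(X)  # gradient of the smooth part
        step = t0
        while True:
            Xn = soft_threshold(X - step * grad, step * lam)
            Xn = 0.5 * (Xn + Xn.T)  # symmetrize against round-off
            if np.all(np.linalg.eigvalsh(Xn) > 0):
                fn = smooth(Xn)
                diff = Xn - X
                # Sufficient decrease: f(Xn) <= quadratic model at X.
                if fn <= f + np.sum(grad * diff) + np.sum(diff**2) / (2 * step):
                    break
            step *= 0.5  # backtrack: leave the cone or overshoot
        if np.linalg.norm(Xn - X) <= tol * max(1.0, np.linalg.norm(X)):
            X, f = Xn, fn
            break
        X, f = Xn, fn
    return X
```

The backtracking loop is the part that (F)PS improves upon: because the log det barrier is self-concordant, the paper derives a step size analytically instead of probing by trial and error, which is what yields its linear convergence guarantee.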


Published in:
Proceedings of the 2013 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 6585-6589
Presented at:
IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), Vancouver, BC, Canada, May 26-31, 2013
Year:
2013
Publisher:
New York, IEEE
ISBN:
978-1-4799-0356-6




Record created 2014-06-02, last modified 2018-03-17
