000199329 001__ 199329
000199329 005__ 20190316235920.0
000199329 020__ $$a978-1-4799-0356-6
000199329 0247_ $$2doi$$a10.1109/ICASSP.2013.6638935
000199329 02470 $$2ISI$$a000329611506150
000199329 037__ $$aCONF
000199329 245__ $$aFast Proximal Algorithms For Self-Concordant Function Minimization With Application To Sparse Graph Selection
000199329 269__ $$a2013
000199329 260__ $$bIEEE$$c2013$$aNew York
000199329 300__ $$a5
000199329 336__ $$aConference Papers
000199329 520__ $$aThe convex l(1)-regularized log det divergence criterion has been shown to produce theoretically consistent graph learning. However, this objective function is challenging since the l(1)-regularization is nonsmooth, the log det objective does not have a globally Lipschitz continuous gradient, and the problem is high-dimensional. Using the self-concordant property of the objective, we propose a new adaptive step size selection and present the (F)PS ((F)ast Proximal algorithms for Self-concordant functions) algorithmic framework, which has linear convergence and exhibits superior empirical results compared to state-of-the-art first order methods.
000199329 6531_ $$aSparse inverse covariance estimation
000199329 6531_ $$aself-concordance
000199329 6531_ $$astep size selection
000199329 700__ $$0245321$$g199236$$aKyrillidis, Anastasios
000199329 700__ $$aCevher, Volkan$$g199128$$0243957
000199329 7112_ $$dMay 26-31, 2013$$cVancouver, BC, Canada$$aIEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP)
000199329 773__ $$tProceedings of the 2013 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)$$q6585-6589
000199329 8564_ $$uhttps://infoscience.epfl.ch/record/199329/files/06638935.pdf$$zn/a$$s350930$$yn/a
000199329 909C0 $$xU12179$$0252306$$pLIONS
000199329 909CO $$ooai:infoscience.tind.io:199329$$qGLOBAL_SET$$pconf$$pSTI
000199329 917Z8 $$x199128
000199329 917Z8 $$x231598
000199329 937__ $$aEPFL-CONF-199329
000199329 973__ $$rREVIEWED$$sPUBLISHED$$aEPFL
000199329 980__ $$aCONF