000204772 001__ 204772
000204772 005__ 20190317000102.0
000204772 037__ $$aCONF
000204772 245__ $$aSparsistency of $\ell_1$-Regularized $M$-Estimators
000204772 269__ $$a2015
000204772 260__ $$c2015
000204772 336__ $$aConference Papers
000204772 520__ $$aWe consider the model selection consistency, or sparsistency, of a broad set of $\ell_1$-regularized $M$-estimators for linear and non-linear statistical models in a unified fashion. For this purpose, we propose the local structured smoothness condition (LSSC) on the loss function. We provide a general result giving deterministic sufficient conditions for sparsistency in terms of the regularization parameter, ambient dimension, sparsity level, and number of measurements. We show that several important statistical models have $M$-estimators that indeed satisfy the LSSC, and as a result, the sparsistency guarantees for the corresponding $\ell_1$-regularized $M$-estimators can be derived as simple applications of our main theorem.
000204772 700__ $$0247574$$g221971$$aLi, Yen-Huan
000204772 700__ $$0248483$$g248798$$aScarlett, Jonathan
000204772 700__ $$aRavikumar, Pradeep
000204772 700__ $$aCevher, Volkan$$g199128$$0243957
000204772 7112_ $$dMay 9-12, 2015$$cSan Diego, California, USA$$aThe 18th International Conference on Artificial Intelligence and Statistics
000204772 8564_ $$uhttps://infoscience.epfl.ch/record/204772/files/sparsistency.pdf$$zPreprint$$s2079556$$yPreprint
000204772 909C0 $$xU12179$$0252306$$pLIONS
000204772 909CO $$qGLOBAL_SET$$pconf$$ooai:infoscience.tind.io:204772$$pSTI
000204772 917Z8 $$x221971
000204772 937__ $$aEPFL-CONF-204772
000204772 973__ $$rREVIEWED$$sACCEPTED$$aEPFL
000204772 980__ $$aCONF