000204772 001__ 204772
000204772 005__ 20180913062939.0
000204772 037__ $$aCONF
000204772 245__ $$aSparsistency of $\ell_1$-Regularized $M$-Estimators
000204772 269__ $$a2015
000204772 260__ $$c2015
000204772 336__ $$aConference Papers
000204772 520__ $$aWe consider the model selection consistency, or sparsistency, of a broad set of $\ell_1$-regularized $M$-estimators for linear and non-linear statistical models in a unified fashion. For this purpose, we propose the local structured smoothness condition (LSSC) on the loss function. We provide a general result giving deterministic sufficient conditions for sparsistency in terms of the regularization parameter, ambient dimension, sparsity level, and number of measurements. We show that several important statistical models have $M$-estimators that indeed satisfy the LSSC, and as a result, the sparsistency guarantees for the corresponding $\ell_1$-regularized $M$-estimators can be derived as simple applications of our main theorem.
000204772 700__ $$0247574$$aLi, Yen-Huan$$g221971
000204772 700__ $$0248483$$aScarlett, Jonathan$$g248798
000204772 700__ $$aRavikumar, Pradeep
000204772 700__ $$0243957$$aCevher, Volkan$$g199128
000204772 7112_ $$aThe 18th International Conference on Artificial Intelligence and Statistics$$cSan Diego, California, USA$$dMay 9-12, 2015
000204772 8564_ $$s2079556$$uhttps://infoscience.epfl.ch/record/204772/files/sparsistency.pdf$$yPreprint$$zPreprint
000204772 909C0 $$0252306$$pLIONS$$xU12179
000204772 909CO $$ooai:infoscience.tind.io:204772$$pconf$$pSTI
000204772 917Z8 $$x221971
000204772 937__ $$aEPFL-CONF-204772
000204772 973__ $$aEPFL$$rREVIEWED$$sACCEPTED
000204772 980__ $$aCONF