000155219 001__ 155219
000155219 005__ 20190316234939.0
000155219 037__ $$aCONF
000155219 245__ $$aFast hard thresholding with Nesterov's gradient method
000155219 269__ $$a2010
000155219 260__ $$c2010
000155219 336__ $$aConference Papers
000155219 520__ $$aWe provide an algorithmic framework for structured sparse recovery which unifies combinatorial optimization with the non-smooth convex optimization framework of Nesterov [1, 2]. Our algorithm, dubbed Nesterov iterative hard-thresholding (NIHT), is similar in spirit to the algebraic pursuits (ALPS) in [3]: we use the gradient information in the convex data-error objective to navigate over the non-convex set of structured sparse signals. While ALPS features a priori approximation guarantees, we were only able to provide an online approximation guarantee for NIHT (i.e., the guarantee requires executing the algorithm). Experiments show, however, that NIHT can empirically outperform ALPS and other state-of-the-art convex optimization-based algorithms in sparse recovery.
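[Editor's note: the abstract describes combining Nesterov-style accelerated gradient steps on the convex data-error objective with a combinatorial hard-thresholding projection. The following is a minimal, hypothetical Python sketch of such an iteration for plain s-sparse recovery under a least-squares objective; the FISTA-style momentum schedule, the constant step size, and all names are illustrative assumptions, not details taken from the paper.]

import numpy as np

def hard_threshold(x, s):
    # Keep the s largest-magnitude entries of x and zero out the rest.
    z = np.zeros_like(x)
    keep = np.argsort(np.abs(x))[-s:]
    z[keep] = x[keep]
    return z

def niht_sketch(A, y, s, n_iter=200):
    # Sketch: minimize 0.5 * ||y - A x||_2^2 over s-sparse x using
    # Nesterov-accelerated gradient steps followed by hard thresholding.
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # conservative constant step size (assumption)
    x_prev = np.zeros(A.shape[1])
    x = np.zeros(A.shape[1])
    t = 1.0
    for _ in range(n_iter):
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t ** 2))
        v = x + ((t - 1.0) / t_next) * (x - x_prev)        # Nesterov momentum point
        grad = A.T @ (A @ v - y)                            # gradient of the data-error objective at v
        x_prev, x = x, hard_threshold(v - step * grad, s)   # gradient step + combinatorial projection
        t = t_next
    return x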
000155219 700__ $$0243957$$g199128$$aCevher, Volkan
000155219 700__ $$aJafarpour, Sina
000155219 7112_ $$dDecember 2010$$cWhistler, Canada$$aAdvances in Neural Information Processing Systems (NIPS) Workshops
000155219 8564_ $$uhttps://infoscience.epfl.ch/record/155219/files/nips2010_1.pdf$$zn/a$$s1366141$$yn/a
000155219 909C0 $$xU12179$$0252306$$pLIONS
000155219 909CO $$ooai:infoscience.tind.io:155219$$qGLOBAL_SET$$pconf$$pSTI
000155219 917Z8 $$x199128
000155219 917Z8 $$x199128
000155219 917Z8 $$x199128
000155219 917Z8 $$x231598
000155219 917Z8 $$x199128
000155219 917Z8 $$x231598
000155219 937__ $$aEPFL-CONF-155219
000155219 973__ $$rNON-REVIEWED$$sPUBLISHED$$aEPFL
000155219 980__ $$aCONF