000229206 001__ 229206
000229206 005__ 20181203024733.0
000229206 037__ $$aARTICLE
000229206 245__ $$aSmoothing technique for nonsmooth composite minimization with linear operator
000229206 269__ $$a2017
000229206 260__ $$c2017
000229206 336__ $$aJournal Articles
000229206 520__ $$aWe introduce and analyze an algorithm for the minimization of convex functions that are the sum of differentiable terms and proximable terms composed with linear operators. The method builds upon the recently developed smoothed gap technique. In addition to a precise convergence rate result, valid even in the presence of linear inclusion constraints, the new method allows explicit treatment of the gradients of the differentiable terms and can be enhanced with a line search. We also study the consequences of restarting the acceleration of the algorithm at a given frequency. These features are not classical for primal-dual methods and allow us to solve difficult large-scale convex optimization problems. We numerically illustrate the superior performance of the algorithm against the state of the art on basis pursuit, TV-regularized least-squares regression, and L1 regression problems.
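Editorial note: the abstract above describes the method only at a high level, and the full algorithm is in the linked preprint. As a rough, hypothetical illustration of the general smoothing idea it refers to (not the authors' actual algorithm), the sketch below applies Nesterov-type smoothing to the L1 regression problem min_x ||Ax - b||_1 mentioned in the abstract: the nonsmooth term is replaced by its Moreau envelope (the Huber function), whose gradient is computed from the proximal operator, and the smoothed problem is solved by accelerated gradient descent. The function names (smoothed_l1_regression, prox_l1) and the smoothing parameter mu are illustrative assumptions, not from the paper.

    # Minimal sketch of smoothing for min_x ||A x - b||_1 (illustrative only).
    import numpy as np

    def prox_l1(u, t):
        # Proximal operator of t*||.||_1 (soft-thresholding).
        return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

    def smoothed_l1_regression(A, b, mu=1e-2, iters=500):
        m, n = A.shape
        # Lipschitz constant of the gradient of the smoothed objective: ||A||^2 / mu.
        lip = np.linalg.norm(A, 2) ** 2 / mu
        x = np.zeros(n); z = x.copy(); t = 1.0
        for _ in range(iters):
            r = A @ z - b
            # Gradient of the Moreau envelope of ||.||_1 at the residual,
            # chained through the linear operator A.
            grad = A.T @ ((r - prox_l1(r, mu)) / mu)
            x_new = z - grad / lip
            t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
            # Nesterov extrapolation step (acceleration).
            z = x_new + ((t - 1.0) / t_new) * (x_new - x)
            x, t = x_new, t_new
        return x

    # Example usage: x_hat = smoothed_l1_regression(np.random.randn(20, 5), np.random.randn(20))

A smaller mu makes the smoothed problem a closer approximation of the original but inflates the Lipschitz constant, which slows the gradient method; managing that trade-off, together with acceleration restarts, is among the issues the abstract says the paper addresses.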
000229206 6531_ $$acomposite minimization
000229206 6531_ $$aforward-backward
000229206 6531_ $$amultivariate minimization
000229206 700__ $$aNguyen, Quang Van
000229206 700__ $$aFercoq, Olivier
000229206 700__ $$0243957$$aCevher, Volkan$$g199128
000229206 773__ $$tPreprint
000229206 8564_ $$s839781$$uhttps://infoscience.epfl.ch/record/229206/files/cevher_fercoq_nguyen.pdf$$yPreprint$$zPreprint
000229206 909C0 $$0252306$$pLIONS$$xU12179
000229206 909CO $$ooai:infoscience.tind.io:229206$$pSTI$$particle
000229206 917Z8 $$x264095
000229206 917Z8 $$x252028
000229206 917Z8 $$x253580
000229206 937__ $$aEPFL-ARTICLE-229206
000229206 973__ $$aEPFL$$rREVIEWED$$sSUBMITTED
000229206 980__ $$aARTICLE