Title: Compressive sensing meets game theory
Authors: Jafarpour, Sina; Schapire, Robert E.; Cevher, Volkan
Year: 2011
DOI: 10.1109/ICASSP.2011.5947144
URL: https://infoscience.epfl.ch/handle/20.500.14299/109218
Type: Conference paper

Abstract: We introduce the Multiplicative Update Selector and Estimator (MUSE) algorithm for sparse approximation in under-determined linear regression problems. Given f = Φα* + μ, the MUSE provably and efficiently finds a k-sparse vector α̂ such that ∥Φα̂ − f∥∞ ≤ ∥μ∥∞ + O(1/√k), for any k-sparse vector α*, any measurement matrix Φ, and any noise vector μ. We cast the sparse approximation problem as a zero-sum game over a properly chosen new space; this reformulation provides salient computational advantages in recovery. When the measurement matrix Φ provides a stable embedding of sparse vectors (the so-called restricted isometry property in compressive sensing), the MUSE also features guarantees on ∥α* − α̂∥₂. Simulation results demonstrate the scalability and performance of the MUSE in solving sparse approximation problems based on the Dantzig Selector.
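
The abstract only sketches the zero-sum-game viewpoint; the following is a minimal, illustrative Python sketch of that general idea, not the paper's MUSE algorithm. It sets up f = Φα* + μ and runs a generic multiplicative-weights game in which the learner mixes over signed coordinates of a scaled ℓ1 ball while the adversary picks the worst-violated measurement. All dimensions, the ℓ1 bound R, the step size η, and the iteration count T are assumptions chosen for the demo.

```python
import numpy as np

# --- Problem setup: f = Phi @ alpha_star + mu, under-determined and k-sparse ---
rng = np.random.default_rng(0)
n, m, k = 256, 64, 5                     # ambient dimension, measurements, sparsity
Phi = rng.standard_normal((m, n)) / np.sqrt(m)
alpha_star = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
alpha_star[support] = rng.standard_normal(k)
mu = 0.01 * rng.standard_normal(m)       # bounded noise
f = Phi @ alpha_star + mu

# --- Generic multiplicative-weights sketch of the zero-sum game ---
# Learner: a distribution p over the 2n signed coordinates, so the candidate
# solution is alpha = R * (p_plus - p_minus), an element of the radius-R l1 ball.
# Adversary: the measurement index with the largest absolute residual (l-infinity).
R = np.abs(alpha_star).sum()             # assumed known l1 bound (for illustration)
A = np.hstack([Phi, -Phi])               # images of the signed unit coordinates
w = np.ones(2 * n)                       # multiplicative weights
eta, T = 0.1, 2000
avg_p = np.zeros(2 * n)
for t in range(T):
    p = w / w.sum()
    residual = R * (A @ p) - f
    j = np.argmax(np.abs(residual))      # adversary: worst-violated measurement
    s = np.sign(residual[j])
    losses = s * R * A[j, :]             # per-coordinate contribution to that residual
    w *= np.exp(-eta * losses / (np.abs(losses).max() + 1e-12))
    avg_p += p
avg_p /= T

alpha_hat = R * (avg_p[:n] - avg_p[n:])  # average play of the learner
print("l-inf residual of estimate:", np.max(np.abs(Phi @ alpha_hat - f)))
print("l-inf norm of the noise   :", np.max(np.abs(mu)))
```

Under standard multiplicative-weights arguments, the averaged play approaches the value of this game, so the worst-case residual shrinks with the number of rounds; the printed residual can be compared against ∥μ∥∞ in the spirit of the abstract's guarantee, though this sketch does not reproduce the paper's specific bound or its k-sparse output.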