Compressive sensing meets game theory
We introduce the Multiplicative Update Selector and Estimator (MUSE) algorithm for sparse approximation in under-determined linear regression problems. Given f = Φα* + μ, the MUSE provably and efficiently finds a k-sparse vector α̂ such that ∥Φα̂ − f∥∞ ≤ ∥μ∥∞ + O(1/√k), for any k-sparse vector α*, any measurement matrix Φ, and any noise vector μ. We cast the sparse approximation problem as a zero-sum game over a properly chosen new space; this reformulation provides salient computational advantages in recovery. When the measurement matrix Φ provides a stable embedding of sparse vectors (the so-called restricted isometry property in compressive sensing), the MUSE also features guarantees on ∥α* − α̂∥₂. Simulation results demonstrate the scalability and performance of the MUSE in solving sparse approximation problems based on the Dantzig Selector.
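To illustrate the zero-sum-game reformulation described above, the following is a minimal Python sketch of a multiplicative-weights / best-response loop for minimizing ∥Φα − f∥∞ over a scaled ℓ1 ball. It is not the paper's exact MUSE procedure: the step size, the `radius` parameter, and the weight update used here are assumptions chosen only to show the structure of one player running multiplicative updates over signed residual constraints while the other plays single-column best responses, which keeps the iterate k-sparse after k rounds.

```python
import numpy as np


def muse_style_sketch(Phi, f, k, radius=1.0, eta=None):
    """Illustrative multiplicative-weights sketch (not the authors' exact MUSE).

    Approximately minimizes ||Phi @ alpha - f||_inf over the l1 ball
    {alpha : ||alpha||_1 <= radius}, returning an at-most-k-sparse iterate.
    """
    n, d = Phi.shape
    eta = eta if eta is not None else np.sqrt(np.log(2 * n) / k)

    # Constraint player's weights over the 2n signed residual constraints
    # +(Phi alpha - f)_i and -(Phi alpha - f)_i.
    w = np.ones(2 * n) / (2 * n)
    alpha = np.zeros(d)

    for _ in range(k):
        # The constraint player's mixed strategy induces a dual vector in R^n.
        p = w / w.sum()
        v = p[:n] - p[n:]

        # Estimator player's best response: one signed, scaled canonical column,
        # so alpha has at most k nonzero entries after k rounds.
        scores = Phi.T @ v
        j = int(np.argmax(np.abs(scores)))
        alpha[j] += -radius * np.sign(scores[j]) / k

        # Multiplicative update: constraints with large signed residual
        # (high payoff for the constraint player) gain weight.
        r = Phi @ alpha - f
        payoffs = np.concatenate([r, -r])
        w *= np.exp(eta * payoffs)
        w /= w.sum()

    return alpha
```

As a usage note, calling `muse_style_sketch(Phi, f, k)` with a random Gaussian `Phi` and `f = Phi @ alpha_star + noise` returns a k-sparse estimate whose ℓ∞ residual can be compared against ∥μ∥∞, mirroring the guarantee stated in the abstract.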