Abstract

The field of Compressed Sensing has shown that a relatively small number of random projections provides sufficient information to accurately reconstruct sparse signals. Inspired by applications in sensor networks, in which each sensor is likely to observe a noisy version of a sparse signal and subsequently add sampling error through computation and communication, we investigate how the distortion differs depending on whether noise is introduced before sampling (observation error) or after sampling (sampling error). We analyze the optimal linear estimator (for known support) and an ℓ1-constrained linear inverse (for unknown support). In both cases, observation noise is shown to be less detrimental than both sampling noise and a low sampling rate. We also provide sampling bounds for a nonstochastic ℓ∞-bounded noise model.
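
To make the two noise models concrete, the following minimal sketch (not from the paper; dimensions, the Gaussian measurement matrix, and the noise level are illustrative assumptions) contrasts observation noise, added to the signal before projection, with sampling noise, added to the measurements afterward. It uses a least-squares estimate restricted to the known support as a simple stand-in for the paper's optimal linear estimator; the resulting error magnitudes depend on the chosen normalization and parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not taken from the paper)
n, k, m = 256, 8, 64      # signal length, sparsity, number of projections
sigma = 0.1               # noise standard deviation

# k-sparse signal with a known support
support = rng.choice(n, size=k, replace=False)
x = np.zeros(n)
x[support] = rng.standard_normal(k)

# Random Gaussian projection matrix (one common normalization choice)
A = rng.standard_normal((m, n)) / np.sqrt(m)

# Observation noise: noise corrupts the signal before it is sampled
y_obs = A @ (x + sigma * rng.standard_normal(n))

# Sampling noise: noise corrupts the projections after sampling
y_samp = A @ x + sigma * rng.standard_normal(m)

# Known-support linear estimator: least squares on the support columns
A_s = A[:, support]

def oracle_estimate(y):
    x_hat = np.zeros(n)
    x_hat[support], *_ = np.linalg.lstsq(A_s, y, rcond=None)
    return x_hat

for label, y in [("observation noise", y_obs), ("sampling noise", y_samp)]:
    err = np.linalg.norm(oracle_estimate(y) - x) ** 2 / np.linalg.norm(x) ** 2
    print(f"{label}: relative squared error = {err:.4f}")
```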
