Abstract

The support recovery problem consists of determining a sparse subset of a set of variables that is relevant in generating a set of observations, and arises in a diverse range of settings such as group testing, compressive sensing, and subset selection in regression. In this paper, we provide a unified approach to support recovery problems, considering general probabilistic observation models relating a sparse data vector to an observation vector. We study the information-theoretic limits for both exact and partial support recovery, taking a novel approach motivated by thresholding techniques in channel coding. We provide general achievability and converse bounds characterizing the trade-off between the error probability and the number of measurements, and we specialize these bounds to the linear and 1-bit compressive sensing models. Our conditions not only provide scaling laws, but also explicit matching or near-matching constant factors. Moreover, our converse results not only provide conditions under which the error probability fails to vanish, but also conditions under which it tends to one.
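
For concreteness, the linear and 1-bit compressive sensing specializations referred to above are commonly written in the following standard forms (a sketch; the paper's exact normalization and noise assumptions may differ):

\[ Y = X\beta + Z, \qquad Z \sim \mathcal{N}(0, \sigma^2 I) \qquad \text{(linear model)} \]
\[ Y = \mathrm{sign}(X\beta + Z) \qquad \text{(1-bit model)} \]

where \(\beta\) is a k-sparse vector whose support is to be recovered from the measurement matrix \(X\) and the observation vector \(Y\).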
