This paper investigates the construction of deterministic measurement matrices preserving the entropy of a random vector with a given probability distribution. In particular, it is shown that for a random vector with i.i.d. discrete components, this is achieved by selecting a subset of rows of a Hadamard matrix such that (i) the selection is deterministic and (ii) the fraction of selected rows vanishes. In contrast, it is shown that for a random vector with i.i.d. continuous components, no entropy-preserving measurement matrix allows dimensionality reduction. These results are in agreement with the results of Wu and Verdú on almost lossless analog compression. This paper is, however, motivated by the complexity properties of Hadamard matrices, which allow the use of efficient and stable reconstruction algorithms. The proof technique is based on a polar code martingale argument and on a new entropy power inequality for integer-valued random variables.
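As an illustrative sketch (not taken from the paper), the measurement setup described above can be mimicked by building a Sylvester-Hadamard matrix and deterministically keeping a small subset of its rows as the measurement matrix. The row index set below is arbitrary; the paper's actual deterministic selection rule is not reproduced here.

```python
def hadamard(n):
    """Return the 2^n x 2^n Sylvester-Hadamard matrix as nested lists of +/-1."""
    H = [[1]]
    for _ in range(n):
        # Sylvester doubling: [[H, H], [H, -H]]
        H = [row + row for row in H] + [row + [-x for x in row] for row in H]
    return H

def partial_hadamard(n, rows):
    """Keep only the given (deterministically chosen) row indices."""
    H = hadamard(n)
    return [H[i] for i in rows]

# Hypothetical example: 8x8 Hadamard matrix, keep 3 of its 8 rows,
# then measure a sparse integer-valued signal x via y = A x.
A = partial_hadamard(3, [0, 3, 5])
x = [1, 0, 2, 0, 0, 1, 0, 0]
y = [sum(a * xi for a, xi in zip(row, x)) for row in A]
```

Here the number of measurements (3) is far smaller than the ambient dimension (8), which is the regime of interest: the paper shows the fraction of retained rows can be made vanishing while the entropy of a discrete-valued input is preserved.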

Published in:
Information Theory Proceedings (ISIT), IEEE International Symposium on Information Theory 2012, 1842-1846
Presented at:
IEEE International Symposium on Information Theory (ISIT'12), MIT, Cambridge, MA, USA, July 2-6, 2012
Year:
2012
Publisher:
New York: IEEE
ISBN:
978-1-4673-2579-0