A Fast Hadamard Transform for Signals with Sub-linear Sparsity in the Transform Domain

In this paper, we design a new iterative low-complexity algorithm for computing the Walsh-Hadamard transform (WHT) of an N-dimensional signal with a K-sparse WHT. We suppose that N is a power of two and that K = O(N^α) scales sub-linearly in N for some α ∈ (0,1). Assuming a random support model for the nonzero transform-domain components, our algorithm reconstructs the WHT of the signal with a sample complexity O(K log_2(N/K)) and a computational complexity O(K log_2(K) log_2(N/K)). Moreover, the algorithm succeeds with high probability approaching 1 for large dimension N. Our approach is mainly based on the subsampling (aliasing) property of the WHT, whereby a carefully designed subsampling of the time-domain signal induces a suitable aliasing pattern in the transform domain. We treat the resulting aliasing patterns as parity-check constraints and represent them by a bipartite graph. We analyze the properties of the resulting bipartite graphs and borrow ideas from codes defined over sparse bipartite graphs to formulate the recovery of the nonzero spectral values as a peeling decoding algorithm for a specific sparse-graph code transmitted over a binary erasure channel (BEC). This enables us to use tools from coding theory (belief-propagation analysis) to characterize the asymptotic performance of our algorithm in the very sparse (α ∈ (0,1/3]) and the less sparse (α ∈ (1/3,1)) regimes. Comprehensive simulation results are provided to assess the empirical performance of the proposed algorithm.
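The subsampling (aliasing) property underlying the algorithm can be checked numerically. The following minimal NumPy sketch (an illustration only, not the paper's recovery algorithm) shows that keeping every other time-domain sample folds pairs of WHT coefficients into single bins:

```python
import numpy as np

def fwht(x):
    """Unnormalized fast Walsh-Hadamard transform; len(x) must be a power of two."""
    x = np.asarray(x, dtype=float).copy()
    h = 1
    while h < len(x):
        for i in range(0, len(x), 2 * h):
            a = x[i:i + h].copy()
            x[i:i + h] = a + x[i + h:i + 2 * h]
            x[i + h:i + 2 * h] = a - x[i + h:i + 2 * h]
        h *= 2
    return x

rng = np.random.default_rng(0)
N = 16
x = rng.standard_normal(N)
X = fwht(x)      # full WHT, O(N log N)

y = x[0::2]      # subsample: keep the even-indexed time samples
Y = fwht(y)      # WHT of the length-N/2 subsampled signal

# Aliasing pattern: each subsampled bin collects exactly two original
# coefficients, 2*Y[l] == X[2l] + X[2l+1].  In the paper, shifts applied
# before subsampling produce different aliasing patterns, which serve as
# the parity-check constraints exploited by the peeling decoder.
assert np.allclose(2 * Y, X[0::2] + X[1::2])
```

If the WHT is K-sparse, most aliased bins then contain zero or one nonzero coefficient, which is what makes peeling-style recovery possible.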

Published in:
IEEE Transactions on Information Theory, vol. 61, no. 4, pp. 2115-2132
Piscataway: Institute of Electrical and Electronics Engineers

Record created 2015-02-05, last modified 2018-03-18
