Abstract

In this paper we consider the recovery of a high-dimensional data matrix from a set of incomplete and noisy linear measurements. We introduce a new model that efficiently restricts the degrees of freedom of the data and, at the same time, is generic enough to find a variety of applications, namely in multichannel signal compressed sensing (e.g., sensor networks, hyperspectral imaging) and compressive sparse principal component analysis (s-PCA). We assume the data matrices to have a simultaneous low-rank and joint-sparse structure, and based on this we propose a novel approach for efficient compressed sensing (CS) of such data. Our CS recovery approach is based on a convex minimization that incorporates this restrictive structure by jointly regularizing the solutions with their nuclear (trace) norm and their l2/l1 mixed norm. Our theoretical analysis uses a new notion of the restricted isometry property (RIP) and indicates that, for sampling schemes satisfying the RIP, our approach can stably recover all low-rank and joint-sparse matrices. For a certain class of random sampling schemes satisfying a particular concentration bound (e.g., subgaussian ensembles), we derive a lower bound on the number of CS measurements, indicating the near-optimality of our recovery approach as well as a significant improvement over the state of the art. We introduce an iterative algorithm based on proximal calculus to solve the joint nuclear and l2/l1 norm minimization problem and, finally, through a series of numerical experiments we demonstrate the empirical recovery phase-transition behavior of this approach.
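
To make the joint regularization concrete, below is a minimal Python/NumPy sketch of the two closed-form proximal operators that a proximal algorithm for this problem would build on; the function names and structure are illustrative assumptions, not the paper's actual implementation.

import numpy as np

def prox_nuclear(X, tau):
    # Proximal operator of tau * ||X||_* (nuclear norm):
    # soft-threshold the singular values of X.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def prox_l21(X, tau):
    # Proximal operator of tau * ||X||_{2,1} (l2/l1 mixed norm):
    # soft-threshold the l2 norm of each row, promoting joint (row) sparsity.
    row_norms = np.linalg.norm(X, axis=1, keepdims=True)
    shrink = np.maximum(1.0 - tau / np.maximum(row_norms, 1e-12), 0.0)
    return X * shrink

A full solver would combine these with a gradient step on the data-fidelity term through a splitting scheme that handles the sum of the two regularizers (e.g., via auxiliary variables), since the proximal operator of the sum is not simply the composition of the two operators above.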
