Journal article

Generalized Sampling: Stability and Performance Analysis

Generalized sampling provides a general mechanism for recovering an unknown input function $f(x) \in H$ from the samples of the responses of $m$ linear shift-invariant systems, each sampled at $1/m$ the reconstruction rate. The system can be designed to perform a projection of $f(x)$ onto the reconstruction subspace $V(\varphi) = \mathrm{span}\{\varphi(x-k)\}_{k \in \mathbb{Z}}$; for example, the family of bandlimited signals with $\varphi(x) = \operatorname{sinc}(x)$. This implies that the reconstruction will be perfect when the input signal is included in $V(\varphi)$: the traditional framework of Papoulis' generalized sampling theory. Otherwise, one recovers a signal approximation $\tilde{f}(x) \in V(\varphi)$ that is consistent with $f(x)$ in the sense that it produces the same measurements. To characterize the stability of the algorithm, we prove that the dual synthesis functions that appear in the generalized sampling reconstruction formula constitute a Riesz basis of $V(\varphi)$, and we use the corresponding Riesz bounds to define the condition number of the system. We then use these results to analyze the stability of various instances of interlaced and derivative sampling. Next, we consider the issue of performance, which becomes pertinent once we have extended the applicability of the method to arbitrary input functions, that is, when $H$ is considerably larger than $V(\varphi)$ and the reconstruction is no longer exact. By deriving general error bounds for projectors, we show that the generalized sampling solution is essentially equivalent to the optimal minimum-error approximation (orthogonal projection), which is generally not accessible. We then perform a detailed analysis for the case in which the analysis filters are in $L_2$ and determine all relevant bound constants explicitly. Finally, we use an interlaced sampling example to illustrate these various calculations.
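The link between Riesz bounds and conditioning can be made concrete numerically. The sketch below (an illustration, not code from the paper) uses the standard fact that the Riesz bounds $A$ and $B$ of $\{\varphi(x-k)\}_{k\in\mathbb{Z}}$ are the minimum and maximum of $\hat{a}(\omega) = \sum_k a[k]e^{-jk\omega}$, where $a[k] = \langle \varphi, \varphi(\cdot-k)\rangle$ is the sampled autocorrelation, and takes the linear B-spline (hat function) as an example generator; the condition number $\sqrt{B/A}$ is one common choice and is an assumption here, not necessarily the paper's exact definition.

```python
import numpy as np

def hat(x):
    """Linear B-spline (triangle function), supported on [-1, 1]."""
    return np.maximum(1.0 - np.abs(x), 0.0)

# Sampled autocorrelation a[k] = <phi, phi(. - k)> by numerical integration.
# Since hat() is supported on [-1, 1], only k in {-1, 0, 1} are nonzero.
x = np.linspace(-2.0, 2.0, 200001)
dx = x[1] - x[0]
a = {k: float(np.sum(hat(x) * hat(x - k)) * dx) for k in (-1, 0, 1)}

# Riesz bounds: min/max over omega of A(omega) = sum_k a[k] cos(k omega)
# (the autocorrelation is symmetric, so the DTFT is real and even).
omega = np.linspace(0.0, np.pi, 4096)
A_w = sum(a[k] * np.cos(k * omega) for k in (-1, 0, 1))
A, B = A_w.min(), A_w.max()
kappa = np.sqrt(B / A)  # condition number sqrt(B/A)

print(A, B, kappa)  # A ~ 1/3, B ~ 1, kappa ~ sqrt(3)
```

For the hat function one gets $a[0] = 2/3$ and $a[\pm 1] = 1/6$, so $\hat{a}(\omega) = 2/3 + (1/3)\cos\omega$, giving $A = 1/3$, $B = 1$, and a condition number of $\sqrt{3} \approx 1.73$: a well-conditioned, though non-orthonormal, basis.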