Conventional sampling (Shannon's sampling formulation and its approximation-theoretic counterparts) and interpolation theories provide effective solutions to the problem of reconstructing a signal from its samples, but they are primarily restricted to the noise-free scenario. The purpose of this thesis is to extend the standard techniques so that they can handle noisy data. First, we consider a realistic setting where a multidimensional signal is prefiltered prior to sampling and the samples are corrupted by additive noise. To counterbalance the effect of noise, the reconstruction problem is formulated in a variational framework where the solution is obtained by minimizing a continuous-domain Tikhonov-like L2 regularization subject to an ℓp-based data-fidelity constraint. We present a theoretical justification for the minimization of this cost functional and show that the global-minimum solution belongs to a shift-invariant space generated by a function that is generally not bandlimited. The optimal reconstruction space is characterized by a condition that links the generating function to the regularization operator and implies the existence of a B-spline-like basis. We also consider stochastic formulations – minimax and minimum mean-squared error (MMSE/Wiener) formulations – of the nonideal sampling problem and show that they yield the same type of estimators and point towards the existence of optimal shift-invariant spaces for certain classes of stochastic processes. In the stochastic context, we also derive an exact formula for the error of approximating a stationary stochastic signal in the presence of discrete additive noise and illustrate the noise-reducing effect of regularization. Next, we focus on a much wider class of non-quadratic regularization functionals for the problem of interpolation in the presence of noise.
Starting from the affine invariance of the solution, we show that the Lp-norm (p ≠ 2) is the most suitable type of non-quadratic regularization for our purpose. We present monotonically convergent numerical algorithms to carry out the minimization of the non-quadratic cost criterion. We also demonstrate experimentally that the proposed regularized interpolation scheme provides superior interpolation performance compared to standard methods in the presence of noise. Finally, we address the problem of selecting an appropriate value for the regularization parameter, a choice that is crucial to the performance of variational methods in general, including those discussed in this thesis. We propose a practical scheme, based on the concept of risk estimation, to achieve minimum-MSE performance. In this context, we first review a well-known result due to Stein (Stein's unbiased risk estimate, SURE) that applies to data corrupted by additive Gaussian noise, and we also derive a new risk estimate for a Poisson-Gaussian mixture model that is appropriate for certain biomedical imaging applications. Next, we introduce a novel and efficient Monte-Carlo technique to compute SURE for arbitrary nonlinear algorithms. We demonstrate experimentally that the proposed Monte-Carlo SURE yields regularization parameter values that are close to the oracle optimum (minimum MSE) for all methods considered in this work. We also present results that illustrate the applicability of our technique to a wide variety of algorithms in denoising and deconvolution.
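The constrained variational formulation of the first part can be written in a representative form; the notation here is our own shorthand rather than the thesis's exact statement, with g[k] the noisy measurements, h the acquisition prefilter, L the regularization operator, and ε a noise-dependent tolerance:

```latex
\min_{f}\; \| \mathrm{L} f \|_{L_2}^{2}
\quad \text{subject to} \quad
\sum_{k} \bigl| (h * f)(\boldsymbol{x}_k) - g[k] \bigr|^{p} \;\le\; \varepsilon .
```

As stated above, the global minimizer of such a problem lies in a shift-invariant space whose (generally non-bandlimited) generator is determined by the operator L.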
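The monotonically convergent minimization of the non-quadratic Lp cost can be sketched with a standard iteratively-reweighted-least-squares (IRLS) scheme. This is a generic majorize-minimize sketch under our own discretization, not the thesis's exact algorithm; the names `irls_lp`, `A` (sampling/interpolation matrix), and `D` (discrete regularization operator) are illustrative:

```python
import numpy as np

def irls_lp(A, g, D, lam, p=1.2, n_iter=50, eps=1e-8):
    """Minimal IRLS sketch for
        min_c  ||A c - g||_2^2 + lam * sum_i |(D c)_i|^p,   1 <= p <= 2.
    Each |u|^p term is majorized by the quadratic (p/2) * w * u^2 with
    w = |u_old|^(p-2), so every iteration reduces to a linear solve and
    the cost is monotonically non-increasing (majorize-minimize)."""
    # Initialize with the unregularized least-squares solution.
    c = np.linalg.lstsq(A, g, rcond=None)[0]
    for _ in range(n_iter):
        # Reweighting from the current iterate; eps guards |u|^(p-2) at u = 0.
        w = (np.abs(D @ c) + eps) ** (p - 2)
        H = A.T @ A + lam * (p / 2) * D.T @ (w[:, None] * D)
        c = np.linalg.solve(H, A.T @ g)
    return c
```

For p = 2 the weights are constant and a single linear solve recovers the quadratic (Tikhonov-like) solution; for p < 2 the reweighting progressively de-emphasizes large regularization residuals.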
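The idea behind the Monte-Carlo computation of SURE can be illustrated as follows: the divergence term of Stein's estimate, which requires the Jacobian of the (possibly nonlinear) algorithm, is replaced by a single random-probe finite difference, so only black-box evaluations of the algorithm are needed. The function name `monte_carlo_sure`, the probe distribution, and the step size `eps` are illustrative choices, not the thesis's exact specification:

```python
import numpy as np

def monte_carlo_sure(denoise, y, sigma, eps=1e-4, rng=None):
    """Estimate Stein's unbiased risk (per sample) for a black-box
    denoiser applied to y = x + n, with n ~ N(0, sigma^2 I).

    The divergence term of SURE is approximated by a single random
    probe b:  div f(y) ~= b . (f(y + eps*b) - f(y)) / eps,
    avoiding any need for the denoiser's analytical Jacobian."""
    rng = np.random.default_rng(rng)
    n = y.size
    b = rng.standard_normal(y.shape)          # zero-mean, unit-variance probe
    f_y = denoise(y)
    div = np.dot(b.ravel(), (denoise(y + eps * b) - f_y).ravel()) / eps
    # SURE = ||f(y) - y||^2 / n  -  sigma^2  +  2 sigma^2 div / n
    return np.sum((f_y - y) ** 2) / n - sigma ** 2 + 2 * sigma ** 2 * div / n
```

Evaluating this estimate over a grid of regularization parameter values and picking the minimizer is how a near-oracle (near minimum-MSE) parameter can be selected without access to the noise-free signal.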