In the past few years, fusing NIR and color images has been explored in general computational photography and computer-vision tasks that traditionally use only color images. The additional information, provided by the differences between light and scene reflections in the visible and NIR bands of the electromagnetic spectrum, has been used in several applications, such as image denoising, image dehazing, shadow detection and removal, and high-dynamic-range imaging.

In this thesis, we study a system that simultaneously captures color and NIR images on a single silicon sensor. Such a camera could be manufactured with only minor changes to the hardware of consumer color cameras, and it could be integrated into small devices such as cell phones. We address two main challenges in color and NIR acquisition. First, we study the spatial and spectral sampling of the scene, which is inevitable when multiple spectral channels are acquired on a single sensor. We then focus on the distortions caused by chromatic aberration.

As in color imaging, we use a color filter array (CFA) to sample the scene in the visible and NIR bands. We address two main challenges regarding the CFA: (1) designing the CFA and (2) developing a demosaicing algorithm that reconstructs full-resolution images from the subsampled measurements. We consider a general CFA whose filters transmit different mixtures of the color and NIR channels. We develop a framework that, by exploiting the spatial and spectral correlations of color and NIR images, computes the transmittance of each filter and the demosaicing matrix. Our optimized CFA and demosaicing outperform other solutions developed for single-sensor color and NIR acquisition.

We also investigate a CFA formed by one blue, one green, one red, and one NIR-pass filter, which we call the RGBN CFA. We assume that, like color cameras, it uses dye filters that do not have sharp cut-offs. Hence, color radiation leaks into the NIR filter, and NIR radiation leaks into the color filters.
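The idea of linear demosaicing from mixed CFA measurements can be illustrated with a minimal toy sketch. The 4-position CFA period, the filter transmittances, and the synthetic spectrally correlated training data below are all illustrative assumptions, not the actual filters or optimization of this thesis:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: a CFA period of 4 pixel positions over 4 spectral channels
# (R, G, B, NIR). F[i] is the assumed transmittance vector of the filter
# at position i; each filter measures one mixture of the channels.
F = np.array([
    [0.8, 0.1, 0.0, 0.1],   # mostly red
    [0.1, 0.8, 0.0, 0.1],   # mostly green
    [0.0, 0.1, 0.8, 0.1],   # mostly blue
    [0.1, 0.1, 0.1, 0.7],   # mostly NIR
])

# Synthetic training data: 4-channel pixel values assumed constant over the
# CFA period (a crude stand-in for spatial and spectral correlation).
n_train = 5000
base = rng.normal(size=(n_train, 1))
patches = base + 0.1 * rng.normal(size=(n_train, 4))

# The sensor records one mixture per position: m = F @ x.
measurements = patches @ F.T

# Learn a linear demosaicing matrix D by least squares so that m @ D ≈ x.
D, *_ = np.linalg.lstsq(measurements, patches, rcond=None)

# Reconstruct the full 4-channel values of a held-out pixel.
x_true = np.array([1.0, 0.9, 1.1, 1.05])
m = F @ x_true
x_hat = m @ D
print(np.round(x_hat, 3))
```

In this toy model the least-squares fit recovers the channel mixtures exactly; in the underdetermined real problem, the same idea works only because spatial and spectral correlations constrain the missing samples.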
We devise an algorithm that reconstructs full-resolution images from these mixed and subsampled sensor measurements. The RGBN CFA and our reconstruction algorithm perform as well as, or even better than, other single-sensor acquisition techniques that use more complicated hardware components.

Chromatic aberration is caused by deficiencies of the optical elements: a simple lens converges light rays of different wavelengths at different distances from the lens. Hence, if the color image is in focus and sharp on the sensor plane, the NIR image captured with the same focus settings is out of focus and blurred. We propose an algorithm that retrieves the lost details of the NIR image by using the gradients of the sharp color image. As the high-frequency details of color and NIR images are not strongly correlated in all image patches, our method locally adapts the contribution of the color gradients to the deblurring. To achieve this, we develop a multiscale scheme that alternates between deblurring the NIR image and estimating the correlation between the color and NIR high-frequency components. Our algorithm outperforms both blind and guided deblurring approaches.

Finally, we design a method that estimates a dense blur-kernel map for scenes in which the severity of chromatic aberration changes as the depths of objects vary across the image. Our method performs better than competing methods, both in estimating the blur-kernel map and in deblurring.
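The core of gradient-guided deblurring can be sketched in one dimension. This is a minimal illustration, not the thesis's algorithm: it assumes a known blur kernel, a single global correlation weight `w` in place of the locally adaptive, multiscale estimate, and plain gradient descent:

```python
import numpy as np

def conv(x, k):
    return np.convolve(x, k, mode="same")

# Toy 1-D signals: a sharp "color" guide g and a correlated sharp NIR x_true.
n = 128
edges = np.zeros(n); edges[32] = 1.0; edges[80] = -1.0
g = np.cumsum(edges)            # sharp guide with two step edges
x_true = 0.9 * g + 0.05        # NIR signal, correlated with the guide

# Chromatic aberration: the NIR channel is captured out of focus (blurred).
k = np.array([1.0, 4.0, 6.0, 4.0, 1.0]); k /= k.sum()
y = conv(x_true, k)

def grad(v):                    # forward differences
    d = np.zeros_like(v); d[1:] = v[1:] - v[:-1]; return d

def grad_T(d):                  # adjoint of the forward-difference operator
    v = np.zeros_like(d); v[:-1] -= d[1:]; v[1:] += d[1:]; return v

# Guided deblurring: minimize ||k*x - y||^2 + lam * ||x' - w * g'||^2,
# pulling the NIR gradients toward a scaled copy of the guide's gradients.
lam, step = 0.5, 0.4
w = 0.9                         # assumed global NIR/guide gradient ratio
x = y.copy()
for _ in range(300):
    data = conv(conv(x, k) - y, k[::-1])   # gradient of the data term
    prior = grad_T(grad(x) - w * grad(g))  # gradient of the guided prior
    x = x - step * (data + lam * prior)

print(float(np.abs(x - x_true).mean()))    # smaller than for the blurred y
```

A per-patch, iteratively re-estimated `w`, as in the multiscale scheme described above, is what prevents the guide from hallucinating color edges into NIR regions where the two channels disagree.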