Spatial Super-Resolution of a Diffusion Field by Temporal Oversampling in Sensor Networks

We study spatio-temporal sampling of a linear diffusion field and show that insufficient spatial sampling density can be compensated for by oversampling in time. Our work is motivated by an issue often encountered in sensor network sampling: increasing the temporal sampling rate is typically easier and less expensive than increasing the spatial sampling density of the network. For the case of sampling a diffusion field, we show that, to achieve this trade-off between spatial and temporal sampling, the spatial arrangement of the sensors must satisfy certain conditions. We provide the precise relationships between the achievable reduction in spatial sampling density, the required temporal oversampling rate, the spatial arrangement of the sensors, and the bounds on the condition numbers of the resulting sampling and reconstruction procedures.
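The trade-off described above can be illustrated with a minimal numerical sketch (this is a hypothetical construction for intuition, not the paper's exact setup): a 1-D periodic diffusion field whose initial condition has 2K+1 Fourier modes is observed by only M < 2K+1 sensors, so any single snapshot is spatially undersampled; stacking N time samples per sensor, with the modes decaying at distinct rates, restores a solvable linear system, provided the sensor positions are chosen suitably.

```python
import numpy as np

# Illustrative sketch (assumed model, not the paper's construction):
# field u(x,t) = sum_k a_k exp(-mu (2 pi k)^2 t) exp(i 2 pi k x), periodic in x.
rng = np.random.default_rng(0)

K = 4                                # initial field has modes k = -K..K (9 unknowns)
ks = np.arange(-K, K + 1)
a_true = rng.standard_normal(2 * K + 1) + 1j * rng.standard_normal(2 * K + 1)

M = 3                                # only 3 sensors: spatially undersampled
x = np.array([0.05, 0.37, 0.71])     # irregular sensor positions (a generic choice)
N = 7                                # temporal oversampling: 7 snapshots per sensor
t = 0.002 * np.arange(N)
mu = 1.0                             # diffusion coefficient

# Build the (M*N) x (2K+1) space-time sampling matrix.
decay = np.exp(-mu * (2 * np.pi * ks) ** 2 * t[:, None])   # shape (N, 2K+1)
steering = np.exp(1j * 2 * np.pi * x[:, None] * ks)        # shape (M, 2K+1)
A = (steering[:, None, :] * decay[None, :, :]).reshape(M * N, 2 * K + 1)

samples = A @ a_true                 # 21 space-time samples, 9 unknowns
a_hat, *_ = np.linalg.lstsq(A, samples, rcond=None)

print("condition number of A:", np.linalg.cond(A))
print("max coefficient error:", np.max(np.abs(a_hat - a_true)))
```

With a single time snapshot (N = 1) the same matrix has only 3 rows for 9 unknowns and the field cannot be recovered; temporal oversampling makes the system overdetermined, and the condition number of A reflects how well the chosen sensor positions separate the symmetric mode pairs, echoing the conditions on sensor arrangement discussed in the paper.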


Published in:
Proc. IEEE International Conference on Acoustics, Speech and Signal Processing, 2249-2252
Presented at:
IEEE International Conference on Acoustics, Speech and Signal Processing, Taipei, Taiwan, April 19-24, 2009
Year:
2009




 Record created 2009-03-02, last modified 2018-03-17
