
Abstract

The field of signal processing has made tremendous progress with the development of digital signal processing. The first foundation of digital signal processing is Shannon's sampling theorem, which shows that any bandlimited analog signal can be reduced to a discrete-time signal. However, digital signals involve a second discretization operation, in amplitude. While this operation, called quantization, is as deterministic as time sampling, the literature offers no strong theory for its analysis. By tradition, quantization is only approximately modeled as an additive source of uniformly distributed, independent white noise. This chapter proposes a theoretical framework that genuinely treats quantization as a deterministic process, is based on Hilbert space analysis, and overcomes some of the limitations of Fourier analysis. While a digital signal is usually considered the representation of an approximate signal (the quantized signal), the chapter shows that it is in fact the representation of a deterministic convex set of analog signals in a Hilbert space. The elements of this set are referred to as “the analog estimates consistent with the digital signal.” This view leads to a new framework of signal processing that is nonlinear and based on convex projections in Hilbert spaces. The chapter also shows that any discretization operation, not only analog-to-digital (A/D) conversion but also signal compression, amounts to encoding sets of signals, that is, associating digital signals with sets of analog signals. From this view and the framework presented in the chapter, directions of research can be proposed for the design of new types of high-resolution A/D converters and new signal compression schemes.
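The set-theoretic view of quantization described above can be illustrated with a minimal NumPy sketch. It assumes a uniform mid-tread quantizer with step `delta` (a simplification; the chapter's framework applies to general discretization operators), and the function names are illustrative, not taken from the chapter. The point it makes is that a quantized sample does not identify one analog amplitude but a whole interval of amplitudes consistent with it, which is the one-dimensional instance of the convex consistency set.

```python
import numpy as np

def quantize(x, delta):
    # Uniform mid-tread quantizer: maps each sample to the
    # nearest integer multiple of the step size delta.
    return delta * np.round(x / delta)

def consistency_interval(q, delta):
    # All analog amplitudes that quantize to the code q form the
    # interval [q - delta/2, q + delta/2): the digital value
    # encodes this set, not a single amplitude.
    return q - delta / 2, q + delta / 2

delta = 0.25
x = np.array([0.10, -0.42, 0.87])   # "true" analog samples
q = quantize(x, delta)              # their digital representation

# Each original sample lies inside the set encoded by its code,
# so it is one of many "analog estimates consistent with" q.
for xi, qi in zip(x, q):
    lo, hi = consistency_interval(qi, delta)
    assert lo <= xi < hi
```

In the chapter's framework this picture extends from per-sample intervals to a convex set of bandlimited signals in a Hilbert space, and reconstruction becomes a matter of projecting onto that set rather than treating the quantized value as signal plus noise.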

