Abstract

It is well known, and surprising, that the uncoded transmission of an independent and identically distributed Gaussian source across an additive white Gaussian noise channel is optimal: no coding strategy, however sophisticated, can perform better. What makes uncoded transmission optimal? In this thesis, it is shown that the optimality of uncoded transmission can be understood as the perfect match of four quantities: the probability distribution of the source, its distortion measure, the conditional probability distribution of the channel, and its input cost function. More generally, what makes a source-channel communication system optimal? Inspired by, and in extension of, the results on uncoded transmission, this can again be understood as a perfect match, now of six quantities: the four above, plus the encoding and decoding functions. The matching condition derived in this thesis is explicit and closed-form. This fact is exploited in various ways, for example to analyze the optimality of source-channel coding systems of finite block length and of systems involving feedback.

In the form of an intermezzo, the potential impact of our findings on the understanding of biological communication is outlined: owing to its simplicity, uncoded transmission is an interesting strategy, e.g., for neural communication. The matching condition of this thesis shows that, apart from being simple, uncoded transmission may also be information-theoretically optimal.

Uncoded transmission is also a useful point of view in network information theory. In this thesis, it is used to derive network source-channel communication results, including a single-source broadcast scenario, to establish capacity results for Gaussian relay networks, and to give a new example of the fact that separate source and channel coding does not lead to optimal performance in general networks.
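As a quick illustration of the opening claim, the classical Gaussian calculation can be spelled out; the notation (source variance \sigma_S^2, noise variance \sigma_Z^2, power constraint P) is chosen here for illustration and is not taken from the thesis. The uncoded scheme scales the source to the admissible power and decodes with the minimum mean-squared error estimator:
\[
X = \sqrt{\tfrac{P}{\sigma_S^2}}\, S, \qquad
\hat S = \mathbb{E}[S \mid Y] = \frac{\sqrt{P\,\sigma_S^2}}{P + \sigma_Z^2}\, Y, \qquad
D_{\mathrm{uncoded}} = \mathbb{E}\big[(S - \hat S)^2\big] = \frac{\sigma_S^2\,\sigma_Z^2}{P + \sigma_Z^2}.
\]
The best distortion achievable by any coding scheme follows from equating the rate-distortion function of the source with the capacity of the channel,
\[
R(D) = \tfrac{1}{2}\log\!\Big(\frac{\sigma_S^2}{D}\Big)
= \tfrac{1}{2}\log\!\Big(1 + \frac{P}{\sigma_Z^2}\Big) = C
\;\Longrightarrow\;
D_{\min} = \frac{\sigma_S^2}{1 + P/\sigma_Z^2} = \frac{\sigma_S^2\,\sigma_Z^2}{P + \sigma_Z^2},
\]
which coincides with D_{\mathrm{uncoded}}: the single-letter scheme already attains the best possible distortion, with no coding delay at all.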
