Phase Transitions for Mutual Information
We consider ensembles of binary linear error-correcting codes obtained by sampling each column of the generator matrix G or parity-check matrix H independently from the set of all binary vectors of weight d (of the appropriate dimension). We investigate the circumstances under which the mutual information between a randomly chosen codeword and the vector obtained after its transmission over a binary-input memoryless symmetric channel (BIMSC) C is exactly n times the capacity of C, where n is the length of the code. For several channels, such as the binary symmetric channel (BSC) and the binary-input additive white Gaussian noise (AWGN) channel, we prove that the probability of this event exhibits a threshold behaviour, depending on whether n/k is smaller than a certain quantity (which depends on the particular channel C and on d), where k is the number of source bits. To show this, we prove a generalization of the following well-known theorem: the expectation of the size of the right kernel of G undergoes a phase transition from 1 to infinity, depending on whether n/k is smaller than a certain quantity determined by the chosen ensemble.
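The phase transition for the expected right-kernel size can be illustrated numerically. The following is a minimal Monte Carlo sketch, not taken from the paper: it assumes the convention that G is an n-by-k matrix mapping k source bits to n code bits, draws each column uniformly from the weight-d binary vectors, and estimates E[|ker G|] = E[2^(k - rank(G))] over GF(2) for a few values of n/k. The specific parameter values (k, d, trials) are illustrative assumptions.

```python
import random


def rank_gf2(vectors):
    """Rank over GF(2) of the span of `vectors`, each given as an int bitmask."""
    pivots = {}  # highest set bit -> basis vector having that pivot
    for v in vectors:
        while v:
            hb = v.bit_length() - 1
            if hb in pivots:
                v ^= pivots[hb]  # eliminate the current highest bit
            else:
                pivots[hb] = v   # new pivot found
                break
    return len(pivots)


def sample_column(n, d, rng):
    """Uniformly random binary vector of length n and Hamming weight d, as a bitmask."""
    mask = 0
    for b in rng.sample(range(n), d):
        mask |= 1 << b
    return mask


def mean_kernel_size(n, k, d, trials, rng):
    """Monte Carlo estimate of E[2^(k - rank(G))], the expected size of the right kernel."""
    total = 0.0
    for _ in range(trials):
        cols = [sample_column(n, d, rng) for _ in range(k)]
        # Column rank equals the rank of G, so the kernel dimension is k - rank.
        total += 2.0 ** (k - rank_gf2(cols))
    return total / trials


if __name__ == "__main__":
    rng = random.Random(0)
    k, d = 200, 3  # illustrative choices, not values from the paper
    for ratio in (1.05, 1.10, 1.20, 1.30):
        n = int(ratio * k)
        est = mean_kernel_size(n, k, d, trials=200, rng=rng)
        print(f"n/k = {ratio:.2f} (n = {n}): estimated E[|ker G|] = {est:.3f}")
```

For small n/k the estimate stays well above 1 (G is frequently rank deficient), while beyond the ensemble-dependent threshold it approaches 1, consistent with the phase transition described above.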