Abstract

We consider communication over a noisy network under randomized linear network coding. Possible error mechanisms include node or link failures, Byzantine behavior of nodes, or an overestimate of the network min-cut. Building on the work of Kötter and Kschischang, we introduce a systematic oblivious random channel model. Within this model, each codeword contains a header (the systematic part). The header effectively records the coefficients of the linear encoding functions, thus simplifying the decoding task. Under this constraint, errors are modeled as random low-rank perturbations of the transmitted codeword. We compute the capacity of this channel and define an error-correction scheme based on random sparse graphs and a low-complexity decoding algorithm. By optimizing over the code degree profile, we show that this construction achieves the channel capacity with complexity that is jointly quadratic in the number of coded information bits and sublogarithmic in the error probability.
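
As a rough illustration of the channel model sketched in the abstract (not the paper's actual construction), the following Python snippet simulates one use of a systematic randomized linear network coding channel over GF(2): the codeword carries an identity header, the network applies a random transfer matrix, and the error is a random low-rank perturbation. All dimensions, variable names, and the helper `random_low_rank` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_low_rank(rows, cols, rank, rng):
    # Random matrix of rank at most `rank` over GF(2), via an outer-product factorization.
    L = rng.integers(0, 2, size=(rows, rank))
    R = rng.integers(0, 2, size=(rank, cols))
    return (L @ R) % 2

n, m, t = 4, 10, 1                                  # packets, packet length, error rank (illustrative)
payload = rng.integers(0, 2, size=(n, m - n))       # coded information bits
X = np.hstack([np.eye(n, dtype=int), payload])      # systematic header = identity block

A = rng.integers(0, 2, size=(n, n))                 # random network transfer matrix
Z = random_low_rank(n, m, t, rng)                   # random rank-t error perturbation

Y = (A @ X + Z) % 2                                 # received packets
# The header columns of Y equal A up to the low-rank error,
# which is what lets the header record the encoding coefficients
# and simplify decoding in the systematic model.
print(Y[:, :n])
```

In this toy setting the receiver reads the (perturbed) transfer matrix directly from the header columns, so decoding reduces to handling the residual low-rank error on the payload; the paper's capacity and coding results are stated for this kind of model.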
