Abstract

The throughput of wireless networks is known to scale poorly as the number of users grows: under various assumptions, the rate at which an arbitrary pair of nodes can communicate must decrease to zero as the number of users tends to infinity. One of these assumptions is that the network be fully connected: the computed rate must hold for every pair of nodes in the network. We show that this requirement can itself be responsible for the lack of throughput scalability. We consider a two-dimensional network of growing area with only one active source-destination pair at any given time, all remaining nodes acting only as potential relays. Allowing an arbitrarily small fraction of the nodes to be disconnected, we show that the per-node throughput remains constant as the network size increases. This result relies on percolation-theoretic arguments and does not hold for one-dimensional networks, where a non-vanishing rate is impossible even if an arbitrarily large fraction of the nodes is allowed to be disconnected. A converse bound is obtained using an ergodic property of shot noises: we show that communication at a fixed non-zero rate forces some of the nodes to be disconnected. Our results are information-theoretic in flavor, as they hold without any assumption on the communication strategies employed by the network nodes.
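
Stated compactly, the two scaling claims of the abstract can be summarized as follows; the notation here ($n$ for the number of nodes, $R(n)$ for the per-node rate, $\varepsilon$ for the fraction of nodes allowed to be disconnected) is ours for illustration and is not taken from the paper:

\[
  \text{2D, any fixed } \varepsilon > 0: \qquad R_{\mathrm{2D}}(n) = \Theta(1) \quad \text{as } n \to \infty,
\]
\[
  \text{1D, any fixed } \varepsilon < 1: \qquad R_{\mathrm{1D}}(n) \to 0 \quad \text{as } n \to \infty.
\]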
