Abstract

We consider distributed detection problems over adaptive networks, where dispersed agents learn continually from streaming data by means of local interactions. The requirement of adaptation allows the network of detectors to track drifts in the underlying hypothesis. The requirement of cooperation allows each agent to deliver performance superior to what it would obtain if acting individually. The simultaneous requirements of adaptation and cooperation are met by employing diffusion algorithms with a constant step-size μ. By conducting a refined asymptotic analysis based on the mathematical framework of exact asymptotics, we arrive at a revealing characterization of the universal behavior of distributed detection over adaptive networks: as functions of 1/μ, the error (log-)probability curves corresponding to different agents stay nearly parallel to each other; however, these curves are ordered according to a criterion that reflects the degree of connectivity of each agent. Depending on the combination weights, the more connected an agent is, the lower its error probability curve will be. The analysis provides explicit analytical formulas for the detection error probabilities, and these expressions are verified by means of extensive simulations. We further extend the reference setting from the doubly-stochastic combination matrices considered in [1] and [2] to the more general and demanding case of right-stochastic combination matrices; this extension poses new and interesting questions about the interplay between the network topology, the combination weights, and the inference performance. The potential of the proposed methods is illustrated by applying the results to canonical detection problems and typical network topologies, for both doubly-stochastic and right-stochastic combination matrices. Interesting and somewhat unexpected behaviors emerge, and the lesson learned is that connectivity matters.
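To make the setting concrete, the following is a minimal simulation sketch of an adapt-then-combine diffusion recursion of the kind described above (not the authors' implementation): each agent exponentially averages its local log-likelihood ratios with constant step-size μ and then combines with its neighbors through a right-stochastic combination matrix. The Gaussian shift-in-mean model, ring topology, uniform combination weights, and zero decision threshold are illustrative assumptions.

```python
# Minimal sketch: diffusion-based adaptive distributed detection (illustrative).
import numpy as np

rng = np.random.default_rng(0)

N = 10        # number of agents (assumption)
mu = 0.05     # constant step-size (adaptation parameter)
theta = 1.0   # mean shift under H1 (canonical Gaussian shift-in-mean problem)
sigma = 2.0   # observation noise standard deviation
T = 2000      # number of streaming samples

# Illustrative topology: ring with self-loops.
adj = np.eye(N, dtype=bool)
for k in range(N):
    adj[k, (k - 1) % N] = True
    adj[k, (k + 1) % N] = True

# Right-stochastic combination matrix: rows sum to one (uniform neighborhood weights).
A = adj / adj.sum(axis=1, keepdims=True)

y = np.zeros(N)     # detection statistic at each agent
under_h1 = True     # generate data under H1 to observe detection behavior

for i in range(T):
    x = (theta if under_h1 else 0.0) + sigma * rng.standard_normal(N)
    llr = theta * (x - theta / 2) / sigma**2   # local log-likelihood ratios
    v = y + mu * (llr - y)                     # adapt: local exponential averaging
    y = A @ v                                  # combine: weighted neighborhood average

decisions = y > 0   # threshold at zero (illustrative choice)
print("per-agent statistics:", np.round(y, 3))
print("decisions (True = H1):", decisions)
```

With a constant step-size, the statistic keeps tracking the data stream rather than converging, which is what allows the network to follow drifts in the underlying hypothesis; the per-agent error probabilities then depend on how well connected each agent is and on the combination weights.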
