Abstract

In this paper, we consider decentralized optimization problems where agents have individual cost functions to minimize, subject to subspace constraints that require the minimizers across the network to lie in low-dimensional subspaces. This constrained formulation includes consensus optimization as a special case and allows for more general task-relatedness models, such as multitask smoothness and coupled optimization. In order to cope with communication constraints, we propose and study a differential-quantization-based approach in which the estimates communicated among agents are quantized. The analysis shows that, under general conditions on the quantization noise and for sufficiently small step-sizes mu, the strategy is stable in the mean-square-error sense. The analysis also reveals the influence of the gradient and quantization noises on performance.
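To make the ingredients concrete, the following is a minimal, illustrative sketch of one iteration combining a local gradient step, differential quantization of the communicated estimates, neighborhood combination, and a projection onto a low-dimensional subspace. It is not the paper's algorithm: the uniform quantizer, the per-agent projection onto a common subspace spanned by U, the combination matrix A, and all function and parameter names are assumptions made here for illustration only.

```python
import numpy as np

def uniform_quantize(x, step=0.05):
    """Illustrative uniform quantizer (stand-in for whatever compressor is used)."""
    return step * np.round(x / step)

def decentralized_quantized_step(w, memory, grads, A, U, mu=0.01, step=0.05):
    """
    One hypothetical iteration for K agents, each holding an M-dimensional estimate.
    w      : (K, M) current local estimates
    memory : (K, M) last reconstructed (quantized) estimates, kept in sync by
             sender and receivers so only quantized differences are exchanged
    grads  : list of K gradient callables
    A      : (K, K) combination matrix (column k holds the weights agent k assigns
             to its neighbors)
    U      : (M, d) orthonormal basis of a common local subspace (a simplification
             of the general network-level subspace constraint)
    """
    K, M = w.shape
    # 1) local adaptation: each agent takes a gradient step on its own cost
    psi = np.stack([w[k] - mu * grads[k](w[k]) for k in range(K)])
    # 2) differential quantization: quantize the difference from the last
    #    reconstructed value, then update the shared reconstruction
    delta_q = uniform_quantize(psi - memory, step)
    memory = memory + delta_q          # receivers reproduce exactly this value
    # 3) combine the reconstructed neighbor estimates
    combined = A.T @ memory            # (K, M)
    # 4) project onto the low-dimensional subspace
    P = U @ U.T
    w_next = combined @ P.T
    return w_next, memory

# Hypothetical usage with quadratic costs J_k(w) = 0.5 * ||w - target_k||^2:
K, M, d = 5, 4, 2
rng = np.random.default_rng(0)
U, _ = np.linalg.qr(rng.standard_normal((M, d)))
A = np.full((K, K), 1.0 / K)                      # fully connected, uniform weights
targets = rng.standard_normal((K, M))
grads = [lambda x, t=t: x - t for t in targets]
w = np.zeros((K, M))
memory = np.zeros((K, M))
for _ in range(200):
    w, memory = decentralized_quantized_step(w, memory, grads, A, U)
```

The memory variable is what makes the scheme "differential": because both sender and receivers update the same reconstruction from the quantized differences, the quantization error affects only the increments rather than the full estimates, which is consistent with the small-step-size stability result stated in the abstract.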
