Diffusion Moving-Average Adaptation Over Networks
Recently, the diffusion moving-average (D-MA) scheme was proposed to combat noisy links in adaptive networks. However, existing theoretical results are limited to networks with mean-square-error costs, where each local optimal solution coincides with the global optimal solution. In this paper, we examine the convergence behavior of the first- and second-order error moments of D-MA under general local cost functions (strongly convex with Lipschitz-continuous gradients) whose local optima may differ. One of the main findings is that, for sufficiently small step-sizes μk and any forgetting factor in the range [0,1), the D-MA algorithm can approach the optimal solution with arbitrary accuracy. The steady-state error bound derived in this work reveals how the link noise, the forgetting factor, and the step-size each contribute to the algorithm's performance. Building on these analyses, we propose a global variable forgetting factor (GVFF) scheme for D-MA. Compared with existing variable forgetting factor schemes, the proposed GVFF is better suited to settings in which the local and global solutions differ. Finally, numerical simulations verify the theoretical results and compare the proposed scheme against competing approaches under several types of link noise, including quantization noise, data-protection noise, and channel-interference noise.
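To illustrate the mechanism the abstract describes, the following is a minimal sketch of a diffusion moving-average iteration: each node runs a local LMS adaptation step, exchanges its intermediate estimate over noisy links, smooths each received estimate with an exponential moving average governed by a forgetting factor ν ∈ [0,1), and then combines the smoothed estimates. All concrete values here (network size, topology, step-size, noise levels) are illustrative assumptions, not the paper's experimental setup, and the mean-square-error data model is the simple case; the paper's analysis covers more general costs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: N nodes on a ring, estimating a common vector w_star
N, M = 5, 3                 # number of nodes, parameter dimension
w_star = rng.standard_normal(M)
mu = 0.01                   # step-size (assumed uniform across nodes)
nu = 0.9                    # forgetting factor in [0, 1)
sigma_link = 0.1            # link-noise standard deviation (e.g. quantization)

# Doubly stochastic combination matrix A (ring topology with self-loops)
A = np.zeros((N, N))
for k in range(N):
    A[k, k] = 0.5
    A[k, (k - 1) % N] = 0.25
    A[k, (k + 1) % N] = 0.25

w = np.zeros((N, M))            # current estimates at each node
bar_psi = np.zeros((N, N, M))   # moving averages of received estimates

for i in range(2000):
    # Adapt: each node takes an LMS step on its own streaming data
    psi = np.empty_like(w)
    for k in range(N):
        u = rng.standard_normal(M)                      # regressor
        d = u @ w_star + 0.05 * rng.standard_normal()   # noisy observation
        psi[k] = w[k] + mu * u * (d - u @ w[k])
    # Exchange over noisy links, smooth with a moving average, then combine
    for k in range(N):
        for l in range(N):
            if A[l, k] > 0:
                # Self-links are noise-free; neighbor links add link noise
                noise = sigma_link * rng.standard_normal(M) if l != k else 0.0
                bar_psi[k, l] = nu * bar_psi[k, l] + (1 - nu) * (psi[l] + noise)
        w[k] = sum(A[l, k] * bar_psi[k, l] for l in range(N) if A[l, k] > 0)

# Network mean-square deviation after adaptation
msd = np.mean(np.sum((w - w_star) ** 2, axis=1))
print(f"steady-state MSD ≈ {msd:.4f}")
```

The moving average attenuates the link-noise variance at steady state at the cost of a slower transient, which is exactly the trade-off a variable forgetting factor scheme such as the proposed GVFF aims to manage.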
2-s2.0-85198325810
Southwest Jiaotong University
Southwest Jiaotong University
Southwest Jiaotong University
Zhejiang Lab
École Polytechnique Fédérale de Lausanne
2024
72
3393
3407
REVIEWED
EPFL