Infoscience
research article

Iterative pre-conditioning for expediting the distributed gradient-descent method: The case of linear least-squares problem

Chakrabarti, Kushal • Gupta, Nirupam • Chopra, Nikhil
March 1, 2022
Automatica

This paper considers the multi-agent linear least-squares problem in a server-agent network architecture. The system comprises multiple agents, each with a set of local data points. The agents are connected to a server, and there is no inter-agent communication. The agents' goal is to compute a linear model that optimally fits the collective data. The agents, however, cannot share their data points. In principle, the agents can solve this problem by collaborating with the server using the server-agent network variant of the classical gradient-descent method. However, when the data points are ill-conditioned, the gradient-descent method requires a large number of iterations to converge. We propose an iterative pre-conditioning technique to mitigate the deleterious impact of the data points' conditioning on the convergence rate of the gradient-descent method. Unlike conventional pre-conditioning techniques, the pre-conditioner matrix used in our proposed technique evolves iteratively. We show that our proposed algorithm converges linearly with an improved rate of convergence in comparison to both the classical and the accelerated gradient-descent methods. For the special case, when the solution of the least-squares problem is unique, our algorithm converges to the solution superlinearly. Through numerical experiments on benchmark least-squares problems, we validate our theoretical findings, and also demonstrate our algorithm's improved robustness against process noise. © 2021 Elsevier Ltd. All rights reserved.
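The iterative pre-conditioning idea described in the abstract can be sketched as below. This is an illustrative reconstruction from the abstract alone, not the paper's exact update rule or tuned step sizes; the function name `ipg_least_squares` and the parameters `alpha`, `delta`, and `iters` are assumptions. A Richardson-type iteration refines the pre-conditioner toward the inverse of the aggregated Gram matrix while the gradient iterates proceed:

```python
import numpy as np

def ipg_least_squares(A, b, alpha, delta, iters):
    """Illustrative sketch of iteratively pre-conditioned gradient
    descent for min_x ||Ax - b||^2.  The pre-conditioner K evolves
    each iteration (a Richardson-type update driving K toward
    (A^T A)^{-1}) instead of being fixed in advance."""
    n = A.shape[1]
    H = A.T @ A                    # server-side aggregate of the agents' A_i^T A_i
    x = np.zeros(n)
    K = np.zeros((n, n))           # pre-conditioner, refined iteratively
    for _ in range(iters):
        K = K - alpha * (H @ K - np.eye(n))  # refine the pre-conditioner
        grad = A.T @ (A @ x - b)             # aggregated gradient
        x = x - delta * (K @ grad)           # pre-conditioned gradient step
    return x

# Toy demo on a consistent, well-posed system (x_true exactly recoverable).
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 3))
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true
alpha = 1.0 / np.linalg.norm(A.T @ A, 2)     # keeps the K-update stable
x_hat = ipg_least_squares(A, b, alpha=alpha, delta=1.0, iters=300)
```

As `K` approaches the inverse Gram matrix, each gradient step approaches a Newton step, which is the intuition behind the improved convergence rate the abstract claims on ill-conditioned data.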

Type
research article
DOI
10.1016/j.automatica.2021.110095
Web of Science ID
WOS:000794995800007
Author(s)
Chakrabarti, Kushal • Gupta, Nirupam • Chopra, Nikhil
Date Issued
2022-03-01
Publisher
Pergamon-Elsevier Science Ltd
Published in
Automatica
Volume
137
Article Number
110095
Subjects
Automation & Control Systems • Engineering, Electrical & Electronic • Engineering • optimization algorithms
Editorial or Peer reviewed
REVIEWED
Written at
EPFL
EPFL units
DCL
Available on Infoscience
June 6, 2022
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/188310
Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.