Infoscience — EPFL, École polytechnique fédérale de Lausanne
research article

Federated Learning Over Wireless Networks: Convergence Analysis and Resource Allocation

Dinh, Canh T. • Tran, Nguyen H. • Nguyen, Minh N. H. • Hong, Choong Seon • Bao, Wei • Zomaya, Albert Y. • Gramoli, Vincent
February 1, 2021
IEEE/ACM Transactions on Networking

There is increasing interest in a fast-growing machine learning technique called Federated Learning (FL), in which model training is distributed over mobile user equipment (UEs), exploiting the UEs' local computation and training data. Despite advantages such as preserving data privacy, FL still faces challenges arising from the heterogeneity of UEs' data and physical resources. To address these challenges, we first propose FEDL, an FL algorithm that can handle heterogeneous UE data without assumptions beyond strongly convex and smooth loss functions. We provide a convergence rate that characterizes the trade-off between the local computation rounds each UE performs to update its local model and the global communication rounds used to update the FL global model. We then formulate the deployment of FEDL over wireless networks as a resource allocation optimization problem that captures the trade-off between FEDL's wall-clock convergence time and the energy consumption of UEs with heterogeneous computing and power resources. Although this wireless resource allocation problem is non-convex, we exploit its structure to decompose it into three sub-problems, analyze their closed-form solutions, and derive insights into the problem design. Finally, we empirically evaluate the convergence of FEDL with PyTorch experiments and provide extensive numerical results for the wireless resource allocation sub-problems. Experimental results show that FEDL outperforms the vanilla FedAvg algorithm in terms of convergence rate and test accuracy in various settings.
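The abstract's core loop — each UE running several local computation rounds on the current global model, followed by a global aggregation round — can be sketched as a minimal federated-averaging loop. This is an illustration in the spirit of the FedAvg baseline the paper compares against, not the paper's FEDL algorithm; the quadratic local losses, learning rate, and round counts below are assumptions chosen for the example (quadratics are strongly convex and smooth, matching the paper's setting).

```python
import numpy as np

def local_grad(w, A, b):
    """Gradient of the strongly convex local loss f_k(w) = ||A w - b||^2 / (2 m)."""
    return A.T @ (A @ w - b) / len(b)

def federated_train(datasets, local_rounds=5, global_rounds=20, lr=0.1):
    """FedAvg-style loop: per global round, every UE takes `local_rounds`
    gradient steps from the current global model, then the server averages
    the resulting local models. `local_rounds` vs. `global_rounds` is the
    computation/communication trade-off the abstract refers to."""
    dim = datasets[0][0].shape[1]
    w_global = np.zeros(dim)
    for _ in range(global_rounds):
        local_models = []
        for A, b in datasets:                      # each UE's private data
            w = w_global.copy()
            for _ in range(local_rounds):          # local computation rounds
                w = w - lr * local_grad(w, A, b)
            local_models.append(w)
        w_global = np.mean(local_models, axis=0)   # global aggregation round
    return w_global

# Synthetic heterogeneous UEs: each device draws its own data, but all are
# noisy observations of the same ground-truth model (hypothetical setup).
rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0])
datasets = []
for _ in range(4):
    A = rng.normal(size=(30, 2))
    datasets.append((A, A @ w_true + 0.01 * rng.normal(size=30)))

w_hat = federated_train(datasets)
print(w_hat)  # close to w_true
```

Raising `local_rounds` reduces how many communication rounds are needed, at the cost of more on-device computation per round — the same knob FEDL's convergence analysis quantifies.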

Type
research article
DOI
10.1109/TNET.2020.3035770
Web of Science ID
WOS:000619370600029

Author(s)
Dinh, Canh T.
Tran, Nguyen H.
Nguyen, Minh N. H.
Hong, Choong Seon
Bao, Wei
Zomaya, Albert Y.
Gramoli, Vincent
Date Issued
2021-02-01
Publisher
IEEE (Institute of Electrical and Electronics Engineers)
Published in
IEEE/ACM Transactions on Networking
Volume
29
Issue
1
Start page
398
End page
409

Subjects
Computer Science, Hardware & Architecture • Computer Science, Theory & Methods • Engineering, Electrical & Electronic • Telecommunications • Computer Science • Engineering • convergence • computational modeling • training • data models • resource management • wireless communication • wireless networks • distributed machine learning • federated learning • optimization decomposition

Editorial or Peer reviewed
REVIEWED
Written at
EPFL
EPFL units
DCL
Available on Infoscience
March 26, 2021
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/176638
  • Contact: infoscience@epfl.ch

Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.