Title: Federated Learning under Covariate Shifts with Generalization Guarantees
Authors: Ramezani-Kebrya, Ali; Liu, Fanghui; Pethick, Thomas Michaelsen; Chrysos, Grigorios; Cevher, Volkan
Date: 2023-06-13
URL: https://infoscience.epfl.ch/handle/20.500.14299/198250
Keywords: ML-AI
Type: Journal article (research article)

Abstract: This paper addresses intra-client and inter-client covariate shifts in federated learning (FL), with a focus on overall generalization performance. To handle covariate shifts, we formulate a new global-model training paradigm and propose Federated Importance-Weighted Empirical Risk Minimization (FTW-ERM), along with improved density-ratio matching methods that do not require perfect knowledge of the supremum of the true ratios. We also propose FITW-ERM, a communication-efficient variant with the same level of privacy guarantees as classical ERM in FL. We theoretically show that FTW-ERM achieves smaller generalization error than classical ERM under certain settings. Experimental results demonstrate the superiority of FTW-ERM over existing FL baselines in challenging federated settings with imbalanced data and distribution shifts across clients.
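As a rough illustration of the importance-weighting idea underlying FTW-ERM, the sketch below contrasts classical ERM (an unweighted average of per-sample losses) with an importance-weighted empirical risk that reweights each sample by an estimated density ratio. This is a generic sketch, not the paper's FTW-ERM algorithm; the losses and density-ratio values are hypothetical placeholders standing in for a real model's losses and a density-ratio matching estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-sample losses computed on one client's local data.
losses = rng.uniform(0.0, 2.0, size=8)

# Hypothetical density-ratio estimates w_i ~ p_target(x_i) / p_client(x_i),
# e.g. produced by a density-ratio matching method.
ratios = rng.uniform(0.5, 2.0, size=8)

# Classical ERM: unweighted average of per-sample losses.
erm_risk = losses.mean()

# Importance-weighted ERM: reweight each sample by its density ratio so the
# empirical risk targets the shifted distribution rather than the local one.
iw_risk = (ratios * losses).mean()

print(f"ERM risk:    {erm_risk:.4f}")
print(f"IW-ERM risk: {iw_risk:.4f}")
```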