Abstract

Federated learning (FL) is appealing for its privacy benefits: a global model is trained from updates computed on mobile devices, while users' data remains local. Standard FL infrastructures, however, are designed to have no energy or performance impact on mobile devices, and are therefore not suitable for applications that require frequent (online) model updates, such as news recommenders.

This article presents FLEET, the first Online FL system, acting as middleware between the Android operating system and the machine learning application. FLEET combines the privacy of Standard FL with the precision of online learning thanks to two core components: (1) I-Prof, a new lightweight profiler that predicts and controls the impact of learning tasks on mobile devices, and (2) ADASGD, a new adaptive learning algorithm that is resilient to delayed updates.
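The abstract does not detail ADASGD's update rule; for intuition only, the sketch below illustrates one generic staleness-damping scheme for SGD, in which the step taken for an update shrinks with its delay. The function name, the damping rule, and all parameters are illustrative assumptions, not FLEET's actual algorithm.

```python
# Hypothetical sketch (not FLEET's actual ADASGD): staleness-damped SGD,
# one common way to make a global model resilient to delayed device updates.
import numpy as np

def apply_delayed_update(weights, gradient, base_lr, staleness):
    """Apply a device gradient, shrinking the step for stale updates.

    staleness = (current global round) - (round the device's model was taken from).
    """
    damped_lr = base_lr / (1.0 + staleness)  # older updates get smaller steps
    return weights - damped_lr * gradient

# Example: an update computed 3 rounds ago contributes with a quarter of
# the nominal learning rate.
w = np.zeros(4)
g = np.ones(4)
w = apply_delayed_update(w, g, base_lr=0.1, staleness=3)
print(w)  # [-0.025 -0.025 -0.025 -0.025]
```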

Our extensive evaluation shows that Online FL, as implemented by FLEET, can deliver a 2.3x quality boost compared to Standard FL while consuming only 0.036% of the battery per day. I-Prof accurately controls the impact of learning tasks, improving prediction accuracy by up to 3.6x for computation time and by up to 19x for energy. ADASGD outperforms alternative FL approaches by 18.4% in terms of convergence speed on heterogeneous data.
