Paper ID: D1-S7-T3.3
Paper Title: FedADC: Accelerated Federated Learning with Drift Control
Authors: Emre Ozfatura, Imperial College London, United Kingdom; Kerem Ozfatura, Ozyegin University, Turkey; Deniz Gunduz, Imperial College London, United Kingdom
Session: D1-S7-T3: Federated Learning
Chaired Session: Tuesday, 13 July, 00:00 - 00:20
Engagement Session: Tuesday, 13 July, 00:20 - 00:40

Abstract
Federated learning (FL) has become the de facto framework for collaborative learning among edge devices with privacy concerns. The core of the FL strategy is the use of stochastic gradient descent (SGD) in a distributed manner. Large-scale implementation of FL brings new challenges, such as incorporating acceleration techniques designed for SGD into the distributed setting, and mitigating the drift problem caused by the non-homogeneous distribution of local datasets. These two problems have been studied separately in the literature; in this paper, we show that both can be addressed with a single strategy, without any major alteration to the FL framework and without introducing additional computation or communication load. To achieve this goal, we propose FedADC, an accelerated FL algorithm with drift control. We empirically illustrate the advantages of FedADC.
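The abstract does not spell out the algorithm, so the following is only a minimal NumPy sketch of one way a FedADC-style round could combine acceleration with drift control: the server's momentum buffer is broadcast to clients along with the model, and each local SGD step mixes the local gradient with that buffer, so local iterates are simultaneously accelerated and anchored to the global descent direction. The toy quadratic objectives, the mixing weight beta, the momentum normalization, and all hyperparameters are illustrative assumptions, not the paper's specification; in particular, broadcasting the buffer adds one vector of downlink traffic per round, whereas the paper claims no additional communication load, and the abstract does not say how that is achieved.

```python
import numpy as np

rng = np.random.default_rng(0)
dim, num_clients = 10, 5
local_steps, rounds = 3, 100
lr, beta = 0.1, 0.9  # assumed local learning rate and momentum weight

# Toy heterogeneous data: client i's loss is ||x - c_i||^2 / 2, so plain
# local SGD drifts toward each client's own minimizer c_i, while the
# global objective is minimized at the mean of the c_i.
client_opts = [rng.normal(size=dim) for _ in range(num_clients)]
global_opt = np.mean(client_opts, axis=0)

x_global = np.zeros(dim)   # global model
momentum = np.zeros(dim)   # server momentum buffer, broadcast each round

for rnd in range(rounds):
    updates = []
    for c in client_opts:
        x = x_global.copy()
        for _ in range(local_steps):
            grad = x - c  # gradient of the toy quadratic loss
            # Drift-controlled, accelerated step (sketch): mix the local
            # gradient with the broadcast server momentum so the iterate
            # follows the global descent direction instead of drifting.
            x -= lr * (beta * momentum + (1 - beta) * grad)
        updates.append(x_global - x)          # accumulated local update
    avg_update = np.mean(updates, axis=0)     # FedAvg-style aggregation
    x_global -= avg_update
    # Refresh the momentum buffer with last round's average update,
    # rescaled back to gradient magnitude (an assumed normalization).
    momentum = avg_update / (lr * local_steps)
    if rnd % 20 == 0:
        print(f"round {rnd:3d}  dist to global optimum: "
              f"{np.linalg.norm(x_global - global_opt):.4f}")
```

On this toy problem the printed distance shrinks steadily, while removing the momentum term from the local step lets each client slide toward its own c_i; that contrast is the drift-control effect the sketch is meant to illustrate.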