FedSampling: A Better Sampling Strategy for Federated Learning
Tao Qi, Fangzhao Wu, Lingjuan Lyu, Yongfeng Huang, Xing Xie
Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence
Main Track. Pages 4154-4162.
https://doi.org/10.24963/ijcai.2023/462
Federated learning (FL) is an important technique for learning models from decentralized data in a privacy-preserving way. Existing FL methods usually sample clients uniformly for local model learning in each round. However, different clients may have significantly different data sizes, and under uniform client sampling the clients with more data do not get more opportunities to contribute to model training, which may lead to inferior performance. In this paper, instead of uniform client sampling, we propose a novel data uniform sampling strategy for federated learning (FedSampling), which can effectively improve the performance of federated learning, especially when the data size distribution is highly imbalanced across clients. In each federated learning round, the local data on each client is randomly sampled for local model learning according to a probability based on the server's desired sample size and the total sample size on all available clients. Since the data size on each client is privacy-sensitive, we propose a privacy-preserving way to estimate the total sample size with a differential privacy guarantee. Experiments on four benchmark datasets show that FedSampling can effectively improve the performance of federated learning.
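The abstract does not spell out the exact mechanisms, so the following is a minimal Python sketch of the idea under two assumptions: that each client keeps each local example independently with probability r = K / N_hat, where K is the server's desired sample size and N_hat is the privately estimated total sample size, and that the size estimate uses a randomized-response-style report for the differential privacy guarantee. All names and parameters (report_size, alpha, max_size, etc.) are illustrative, not the paper's actual API or protocol.

```python
# Hypothetical sketch of data-uniform sampling with a private size estimate;
# the mechanism details are assumptions, not the paper's exact method.
import random

def report_size(true_size: int, alpha: float, max_size: int) -> int:
    """Client side: report local data size via randomized response.

    With probability alpha, report the (clipped) true size; otherwise
    report a uniformly random value in [1, max_size].
    """
    clipped = min(true_size, max_size)
    if random.random() < alpha:
        return clipped
    return random.randint(1, max_size)

def estimate_total_size(reports: list[int], alpha: float, max_size: int) -> float:
    """Server side: unbiased estimate of the total sample size.

    E[report] = alpha * true_size + (1 - alpha) * (max_size + 1) / 2,
    so subtract the expected noise and rescale by alpha.
    """
    n = len(reports)
    noise_mean = (1 - alpha) * (max_size + 1) / 2
    # Floor at 1.0 to avoid a degenerate estimate from noisy reports.
    return max((sum(reports) - n * noise_mean) / alpha, 1.0)

def sample_local_data(local_data: list, desired_total: int, est_total: float) -> list:
    """Client side: keep each example with probability r = K / N_hat,
    so sampling is (approximately) uniform over data, not over clients."""
    r = min(desired_total / est_total, 1.0)
    return [x for x in local_data if random.random() < r]

# Usage: clients with highly imbalanced data sizes report privately,
# the server estimates the total, then each client samples locally.
client_data = [[f"c{i}_x{j}" for j in range(size)]
               for i, size in enumerate([5, 50, 500])]
alpha, max_size = 0.8, 1000
reports = [report_size(len(d), alpha, max_size) for d in client_data]
n_hat = estimate_total_size(reports, alpha, max_size)
batches = [sample_local_data(d, desired_total=100, est_total=n_hat) for d in client_data]
print(n_hat, [len(b) for b in batches])
```

Note that under this scheme a client with 500 examples contributes roughly 100 times as many sampled examples per round as one with 5, which is the data-uniform behavior the abstract contrasts with uniform client sampling.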
Keywords:
Machine Learning: ML: Federated learning
Data Mining: DM: Privacy-preserving data mining