Practical Hybrid Gradient Compression for Federated Learning Systems

Sixu Hu, Linshan Jiang, Bingsheng He

Proceedings of the Thirty-Third International Joint Conference on Artificial Intelligence
Main Track. Pages 4147-4155. https://doi.org/10.24963/ijcai.2024/458

High communication cost is a major challenge in the federated learning (FL) training process. Several methods, primarily sparsification-based, have been proposed to reduce communication cost on the uplink channel, but they overlook the downlink channel. As a result, model accuracy and communication cost issues arise when they are applied in practical FL systems, especially when bandwidth is limited on both the uplink and downlink channels. In this paper, we propose a novel secure-FL-compatible hybrid gradient compression framework (HGC) that handles both uplink and downlink communication. Specifically, HGC identifies and exploits three types of redundancy in the FL training process. With proposed optimization methods based on compression ratio correction and dynamic momentum correction, HGC improves the trade-off between communication cost and model performance. Extensive theoretical and empirical analysis demonstrates the effectiveness of our framework in achieving a high compression ratio for both uplink and downlink communication with negligible loss of model accuracy, surpassing state-of-the-art compression methods.
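The abstract does not spell out HGC's exact algorithm, but the sparsification-based uplink compression it builds on typically combines top-k gradient selection with local momentum and residual (error-feedback) accumulation. The sketch below is a minimal, generic illustration of that building block, not the authors' HGC method; the class name `MomentumCorrectedCompressor` and parameters `ratio` and `beta` are illustrative assumptions.

```python
import numpy as np

def topk_sparsify(grad, ratio):
    """Keep the largest-magnitude `ratio` fraction of entries; zero the rest."""
    k = max(1, int(grad.size * ratio))
    flat = grad.ravel()
    idx = np.argpartition(np.abs(flat), -k)[-k:]   # indices of the k largest magnitudes
    mask = np.zeros(flat.shape, dtype=bool)
    mask[idx] = True
    sent = np.where(mask, flat, 0.0).reshape(grad.shape)      # sparse update to transmit
    residual = np.where(mask, 0.0, flat).reshape(grad.shape)  # unsent error kept locally
    return sent, residual

class MomentumCorrectedCompressor:
    """Illustrative client-side compressor: momentum plus residual accumulation
    before top-k sparsification (in the spirit of deep gradient compression)."""
    def __init__(self, shape, ratio=0.01, beta=0.9):
        self.ratio = ratio
        self.beta = beta
        self.momentum = np.zeros(shape)
        self.residual = np.zeros(shape)

    def compress(self, grad):
        self.momentum = self.beta * self.momentum + grad  # accumulate local momentum
        self.residual += self.momentum                    # add previously unsent error
        sent, self.residual = topk_sparsify(self.residual, self.ratio)
        return sent                                       # sparse tensor to upload
```

At a compression ratio of 0.01, each round uploads roughly 1% of the gradient entries, while the residual buffer carries the remaining information forward to later rounds so accuracy degrades only slightly.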
Keywords:
Machine Learning: ML: Federated learning