Continual Federated Learning Based on Knowledge Distillation
Yuhang Ma, Zhongle Xie, Jue Wang, Ke Chen, Lidan Shou
Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence
Main Track. Pages 2182-2188.
https://doi.org/10.24963/ijcai.2022/303
Federated learning (FL) is a promising approach for learning a shared global model on decentralized data owned by multiple clients without exposing their private data. In real-world scenarios, the data accumulated at the client side varies in distribution over time. As a consequence, the global model tends to forget the knowledge obtained from previous tasks while learning new ones, a phenomenon known as "catastrophic forgetting". Previous studies in centralized learning use techniques such as data replay and parameter regularization to mitigate catastrophic forgetting. Unfortunately, these techniques cannot be applied directly to adequately solve this non-trivial problem under FL. We propose Continual Federated Learning with Distillation (CFeD) to address catastrophic forgetting under FL. CFeD performs knowledge distillation on both the clients and the server, with each party independently holding an unlabeled surrogate dataset, to mitigate forgetting. Moreover, CFeD assigns different learning objectives, namely learning the new task and reviewing old tasks, to different clients, aiming to improve the learning ability of the model. Experimental results show that our method performs well in mitigating catastrophic forgetting and achieves a good trade-off between the two objectives.
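The following is a minimal, hypothetical sketch (not the authors' implementation) of the client-side idea sketched in the abstract: some clients train on the new task while "reviewing" clients distill the previous global model's outputs on an unlabeled surrogate dataset. All names (`old_model`, `model`, `new_loader`, `surrogate_loader`, `kd_weight`, `temperature`) and the PyTorch setup are assumptions; the server-side distillation step is omitted.

```python
# Hypothetical sketch of client-side distillation for mitigating forgetting in FL.
# Assumptions: `old_model` is the frozen global model from the previous task (teacher),
# `model` is the model being trained locally, `new_loader` yields labeled new-task
# batches (x, y), and `surrogate_loader` yields unlabeled surrogate batches x.
import torch
import torch.nn.functional as F


def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Soft-label KD loss: KL divergence between softened teacher and student outputs."""
    t = temperature
    return F.kl_div(
        F.log_softmax(student_logits / t, dim=1),
        F.softmax(teacher_logits / t, dim=1),
        reduction="batchmean",
    ) * (t * t)


def client_update(model, old_model, new_loader, surrogate_loader,
                  review_old_tasks, epochs=1, lr=0.01, kd_weight=1.0):
    """One local round: either learn the new task or review old tasks via distillation."""
    old_model.eval()  # frozen teacher carrying old-task knowledge
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)

    for _ in range(epochs):
        if review_old_tasks:
            # "Review" clients: match the old model's predictions on surrogate data.
            for x in surrogate_loader:
                with torch.no_grad():
                    teacher_logits = old_model(x)
                loss = kd_weight * distillation_loss(model(x), teacher_logits)
                optimizer.zero_grad()
                loss.backward()
                optimizer.step()
        else:
            # "Learning" clients: ordinary supervised training on the new task.
            for x, y in new_loader:
                loss = F.cross_entropy(model(x), y)
                optimizer.zero_grad()
                loss.backward()
                optimizer.step()
    return model.state_dict()
```

In this sketch, the returned state dicts would be aggregated by the server as in standard FL; how CFeD performs the corresponding server-side distillation on its own surrogate dataset is described in the paper itself.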
Keywords:
Data Mining: Federated Learning
Machine Learning: Incremental Learning