FedES: Federated Early-Stopping for Hindering Memorizing Heterogeneous Label Noise

Bixiao Zeng, Xiaodong Yang, Yiqiang Chen, Zhiqi Shen, Hanchao Yu, Yingwei Zhang

Proceedings of the Thirty-Third International Joint Conference on Artificial Intelligence
Main Track. Pages 5416-5424. https://doi.org/10.24963/ijcai.2024/599

Federated learning (FL) facilitates collaborative model training across distributed clients while maintaining privacy. Federated noisy label learning (FNLL) is more challenging due to data inaccessibility and noise heterogeneity. Existing works primarily assume clients are either noisy or clean, which may lack the flexibility to adapt to diverse label noise across different clients, especially when entirely clean or entirely noisy clients are not the majority. To address this, we propose a general noise-robust federated learning framework called Federated Early-Stopping (FedES), which adaptively updates critical parameters of each local model based on its noise rate, thereby avoiding overfitting to noisy labels. FedES is composed of two stages: federated noise estimation, and parameter-adaptive local updating and global aggregation. We introduce a signed distance based on local and global gradients during a federated round to estimate clients' noise rates without requiring additional information. Based on this measure, clients apply varying degrees of early-stopping during local updating, and a noise-aware global aggregation is then employed to achieve noise-robust learning. Extensive experiments on varying synthetic and real-world label noise demonstrate the superior performance of FedES over state-of-the-art methods.
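To make the two stages concrete, below is a minimal Python sketch of the idea, not the paper's exact formulation: the projection-based signed distance, the min-max normalization to a noise-rate proxy, and the linear epoch and aggregation-weight schedules are all illustrative assumptions, and every function name is hypothetical.

    import numpy as np

    def signed_distance(local_grad, global_grad):
        # Hypothetical measure: project the client's gradient onto the global
        # gradient direction; a smaller (or negative) projection suggests the
        # client's update conflicts with the consensus, hinting at label noise.
        g = global_grad / (np.linalg.norm(global_grad) + 1e-12)
        return float(np.dot(local_grad, g))

    def estimate_noise_rates(local_grads, global_grad):
        # Map signed distances to [0, 1] noise-rate proxies: the farther a
        # client's update falls below the consensus direction, the noisier
        # its labels are assumed to be (min-max rescaling is an assumption).
        d = np.array([signed_distance(g, global_grad) for g in local_grads])
        span = d.max() - d.min() + 1e-12
        return (d.max() - d) / span

    def local_epochs(noise_rate, max_epochs=10, min_epochs=1):
        # Stronger early-stopping for noisier clients: fewer local epochs
        # before the model can start memorizing corrupted labels.
        return max(min_epochs, int(round(max_epochs * (1.0 - noise_rate))))

    def noise_aware_aggregate(client_params, noise_rates):
        # Down-weight noisy clients when averaging model parameters.
        alpha = 1.0 - np.asarray(noise_rates)
        alpha = alpha / alpha.sum()
        return sum(a * w for a, w in zip(alpha, client_params))

    # Toy usage: three clients whose gradients deviate from the global
    # gradient by increasing amounts, mimicking rising label noise.
    rng = np.random.default_rng(0)
    global_grad = rng.normal(size=100)
    local_grads = [global_grad + rng.normal(scale=s, size=100)
                   for s in (0.1, 0.5, 2.0)]
    rates = estimate_noise_rates(local_grads, global_grad)
    print(rates, [local_epochs(r) for r in rates])

In this sketch the noisiest client both trains for the fewest local epochs and receives the smallest aggregation weight, which mirrors the abstract's combination of adaptive early-stopping with noise-aware aggregation.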
Keywords:
Machine Learning: ML: Federated learning
Machine Learning: ML: Applications
Machine Learning: ML: Weakly supervised learning