FedNDA: Enhancing Federated Learning with Noisy Client Detection and Robust Aggregation

DOI:

https://doi.org/10.4108/eetinis.v12i3.8720

Keywords:

Federated learning, Deep learning, Noisy clients, Non-IID, Class imbalance

Abstract

Federated learning is a decentralized methodology that enables multiple clients to collaboratively train a global model while keeping their local data private. Although federated learning enhances data privacy, it faces challenges related to data quality and client behavior. A fundamental issue is the presence of noisy labels at certain clients, which degrades the global model's performance. To address this problem, this paper introduces FedNDA, a Federated learning framework with Noisy client Detection and robust Aggregation. In the first stage, FedNDA detects noisy clients by analyzing the distribution of their local losses, since a noisy client exhibits a loss distribution distinct from that of clean clients. To handle the class imbalance present in local data, we use per-class losses instead of the total loss. We then assign each client a noisiness score, computed as the Earth Mover's Distance between the client's per-class loss distribution and the average distribution of all clean clients; this metric is more sensitive for detecting noisy clients than conventional metrics such as the Euclidean distance or the L1 norm. The noisiness score is subsequently transferred to the server and used in the aggregation function to prioritize clean clients while reducing the influence of noisy ones. Experimental results demonstrate that FedNDA outperforms FedAvg and FedNoRo by 4.68% and 3.6% on the CIFAR-10 dataset, and by 10.65% and 0.48% on the ICH dataset, respectively, in a high-noise setting.
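
To make the pipeline concrete, the sketch below illustrates, under stated assumptions, how the three steps described above could fit together: per-class local losses, an Earth Mover's Distance (EMD) noisiness score against the clean-client average, and score-aware server aggregation. The function names (`per_class_losses`, `noisiness_score`, `robust_aggregate`) and the exponential down-weighting rule are illustrative choices, not the exact formulation used in the paper.

```python
# Minimal sketch (not the authors' implementation) of the FedNDA steps outlined
# in the abstract. All names and the exponential down-weighting rule are
# illustrative assumptions.
import numpy as np

def per_class_losses(losses, labels, num_classes):
    """Mean local loss per class; using per-class losses instead of one total
    loss makes the profile less sensitive to class imbalance."""
    profile = np.zeros(num_classes)
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            profile[c] = losses[mask].mean()
    return profile

def noisiness_score(client_profile, clean_avg_profile):
    """Earth Mover's Distance between two per-class loss profiles, each
    normalized into a discrete distribution over class indices."""
    p = client_profile / (client_profile.sum() + 1e-12)
    q = clean_avg_profile / (clean_avg_profile.sum() + 1e-12)
    # For 1-D distributions on the same ordered support, EMD equals the
    # L1 distance between the two cumulative distributions.
    return float(np.abs(np.cumsum(p) - np.cumsum(q)).sum())

def robust_aggregate(client_params, scores, sample_counts):
    """Server-side aggregation: clients with higher noisiness scores are
    down-weighted (here with an assumed exponential rule)."""
    weights = np.asarray(sample_counts, dtype=float) * np.exp(-np.asarray(scores))
    weights /= weights.sum()
    return sum(w * p for w, p in zip(weights, client_params))
```

In this sketch, a clean client whose per-class loss profile matches the clean-client average receives a score near zero and a weight close to its FedAvg (size-proportional) share, while a client with a skewed profile is exponentially down-weighted.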

References

[1] McMahan HB, Moore E, Ramage D, Agüera y Arcas B. Federated learning of deep networks using model averaging. arXiv preprint arXiv:1602.05629. 2016.

[2] Li T, Sahu AK, Talwalkar A, Smith V. Federated learning: Challenges, methods, and future directions. IEEE signal processing magazine. 2020;37(3):50-60.

[3] Li Q, Wen Z, Wu Z, Hu S, Wang N, Li Y, et al. A survey on federated learning systems: Vision, hype and reality for data privacy and protection. IEEE Transactions on Knowledge and Data Engineering. 2021;35(4):3347-66.

[4] Song H, Kim M, Park D, Shin Y, Lee JG. Learning from noisy labels with deep neural networks: A survey. IEEE transactions on neural networks and learning systems. 2022;34(11):8135-53.

[5] Irvin J, Rajpurkar P, Ko M, Yu Y, Ciurea-Ilcus S, Chute C, et al. Chexpert: A large chest radiograph dataset with uncertainty labels and expert comparison. In: Proceedings of the AAAI conference on artificial intelligence. vol. 33; 2019. p. 590-7.

[6] Karimi D, Dou H, Warfield SK, Gholipour A. Deep learning with noisy labels: Exploring techniques and remedies in medical image analysis. Medical image analysis. 2020;65:101759.

[7] Chen Y, Yang X, Qin X, Yu H, Chen B, Shen Z. FOCUS: Dealing with label quality disparity in federated learning. arXiv preprint arXiv:2001.11359. 2020.

[8] Tsouvalas V, Saeed A, Ozcelebi T, Meratnia N. Labeling Chaos to Learning Harmony: Federated Learning with Noisy Labels; 2023. Available from: https://arxiv.org/abs/2208.09378.

[9] Lu Y, Chen L, Zhang Y, Zhang Y, Han B, Cheung YM, et al. Federated Learning with Extremely Noisy Clients via Negative Distillation; 2024. Available from: https://arxiv.org/abs/2312.12703.

[10] Xu J, Chen Z, Quek TQS, Chong KFE. FedCorr: Multi-Stage Federated Learning for Label Noise Correction; 2022. Available from: https://arxiv.org/abs/2204.04677.

[11] Li J, Li G, Cheng H, Liao Z, Yu Y. FedDiv: Collaborative Noise Filtering for Federated Learning with Noisy Labels; 2024. Available from: https://arxiv.org/abs/2312.12263.

[12] Giap TT, Kieu TD, Le TL, Tran TH. FedDC: Label Noise Correction With Dynamic Clients for Federated Learning. IEEE Internet of Things Journal. 2024.

[13] Wu N, Yu L, Jiang X, Cheng KT, Yan Z. FedNoRo: Towards Noise-Robust Federated Learning by Addressing Class Imbalance and Label Noise Heterogeneity; 2023. Available from: https://arxiv.org/abs/2305.05230.

[14] Jiang X, Sun S, Li J, Xue J, Li R, Wu Z, et al. Tackling Noisy Clients in Federated Learning with End-to-end Label Correction. In: Proceedings of the 33rd ACM International Conference on Information and Knowledge Management. CIKM ’24. ACM; 2024. p. 1015–1026. Available from: http://dx.doi.org/10.1145/3627673.3679550.

[15] Zhang C, Bengio S, Hardt M, Recht B, Vinyals O. Understanding deep learning (still) requires rethinking generalization. Commun ACM. 2021 Feb;64(3):107–115. Available from: https://doi.org/10.1145/3446776.

[16] Frenay B, Verleysen M. Classification in the Presence of Label Noise: A Survey. IEEE Transactions on Neural Networks and Learning Systems. 2014;25(5):845-69.

[17] Han B, Yao Q, Liu T, Niu G, Tsang IW, Kwok JT, et al. A Survey of Label-noise Representation Learning: Past, Present and Future; 2021. Available from: https://arxiv.org/abs/2011.04406.

[18] Song H, Kim M, Park D, Shin Y, Lee JG. Learning From Noisy Labels With Deep Neural Networks: A Survey. IEEE Transactions on Neural Networks and Learning Systems. 2023;34(11):8135-53.

[19] Yang M, Qian H, Wang X, Zhou Y, Zhu H. Client selection for federated learning with label noise. IEEE Transactions on Vehicular Technology. 2021;71(2):2193-7.

[20] Fang X, Ye M. Robust Federated Learning with Noisy and Heterogeneous Clients. In: 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR); 2022. p. 10062-71.

[21] Kim S, Shin W, Jang S, Song H, Yun SY. FedRN: Exploiting k-reliable neighbors towards robust federated learning. In: Proceedings of the 31st ACM International Conference on Information & Knowledge Management; 2022. p. 972-81.

[22] Krizhevsky A. Learning Multiple Layers of Features from Tiny Images. Master’s thesis, University of Toronto. 2009.

[23] Flanders AE, Prevedello LM, Shih G, Halabi SS, Kalpathy-Cramer J, Ball R, et al. Construction of a machine learning dataset through collaboration: the RSNA 2019 brain CT hemorrhage challenge. Radiology: Artificial Intelligence. 2020;2(3):e190211.

[24] He K, Zhang X, Ren S, Sun J. Deep Residual Learning for Image Recognition; 2015. Available from: https://arxiv.org/abs/1512.03385.

[25] Li Q, He B, Song D. Model-contrastive federated learning. In: Proceedings of the IEEE/CVF conference on computer vision and pattern recognition; 2021. p. 10713-22.

[26] Menon AK, Jayasumana S, Rawat AS, Jain H, Veit A, Kumar S. Long-tail learning via logit adjustment. arXiv preprint arXiv:2007.07314. 2020.

[27] Yang S, Park H, Byun J, Kim C. Robust federated learning with noisy labels. IEEE Intelligent Systems. 2022;37(2): 35-43.

[28] Fang X, Ye M. Robust federated learning with noisy and heterogeneous clients. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition; 2022. p. 10072-81.

[29] Jiang X, Sun S, Wang Y, Liu M. Towards federated learning against noisy labels via local self-regularization. In: Proceedings of the 31st ACM International Conference on Information & Knowledge Management; 2022. p. 862-73.

Published

03-07-2025

How to Cite

Kieu, T. D., Fonbonne, C., Tran, T.-K., Le, T.-L., Vu, H., Nguyen, H.-T., & Tran, T.-H. (2025). FedNDA: Enhancing Federated Learning with Noisy Client Detection and Robust Aggregation. EAI Endorsed Transactions on Industrial Networks and Intelligent Systems, 12(3). https://doi.org/10.4108/eetinis.v12i3.8720

Funding data

  • MOET
    Grant numbers B2023-BKA-09