TY - JOUR
T1 - Continual and wisdom learning for federated learning
T2 - A comprehensive framework for robustness and debiasing
AU - Iqbal, Saeed
AU - Zhong, Xiaopin
AU - Khan, Muhammad Attique
AU - Wu, Zongze
AU - AlHammadi, Dina Abdulaziz
AU - Liu, Weixiang
AU - Choudhry, Imran Arshad
N1 - Publisher Copyright:
© 2025 Elsevier Ltd
PY - 2025/9
Y1 - 2025/9
N2 - Federated Learning (FL) has transformed decentralized machine learning; however, it still faces challenges with noisy labels, heterogeneous clients, and sparse datasets, especially in sensitive domains such as healthcare. To address these issues, this study introduces a robust FL framework that integrates advanced Continual Learning (CL) and Wisdom Learning (WL) techniques. Elastic Weight Consolidation (EWC) prevents catastrophic forgetting by penalizing deviations from critical weights, while Progressive Neural Networks (PNNs) leverage modular architectures with lateral connections to enable knowledge transfer across tasks and isolate client-specific biases. WL incorporates consensus-based aggregation, dynamic model distillation, and adaptive ensemble learning to enhance model robustness against noisy updates and biased data distributions. The framework is rigorously validated on benchmark medical imaging datasets, including ADNI, BraTS, PathMNIST, BreastMNIST, and ChestMNIST, demonstrating significant improvements in fairness metrics, with up to a 94.3% reduction in bias (Demographic Parity) and a 92.7% improvement in accuracy fairness (Accuracy Parity). These results establish the effectiveness of the proposed approach in achieving stable, equitable, and high-performing global models under challenging FL conditions characterized by dynamic client settings, label noise, and class imbalance.
AB - Federated Learning (FL) has transformed decentralized machine learning; however, it still faces challenges with noisy labels, heterogeneous clients, and sparse datasets, especially in sensitive domains such as healthcare. To address these issues, this study introduces a robust FL framework that integrates advanced Continual Learning (CL) and Wisdom Learning (WL) techniques. Elastic Weight Consolidation (EWC) prevents catastrophic forgetting by penalizing deviations from critical weights, while Progressive Neural Networks (PNNs) leverage modular architectures with lateral connections to enable knowledge transfer across tasks and isolate client-specific biases. WL incorporates consensus-based aggregation, dynamic model distillation, and adaptive ensemble learning to enhance model robustness against noisy updates and biased data distributions. The framework is rigorously validated on benchmark medical imaging datasets, including ADNI, BraTS, PathMNIST, BreastMNIST, and ChestMNIST, demonstrating significant improvements in fairness metrics, with up to a 94.3% reduction in bias (Demographic Parity) and a 92.7% improvement in accuracy fairness (Accuracy Parity). These results establish the effectiveness of the proposed approach in achieving stable, equitable, and high-performing global models under challenging FL conditions characterized by dynamic client settings, label noise, and class imbalance.
KW - Client heterogeneity
KW - Continual Learning (CL)
KW - Debiasing techniques
KW - Federated Learning (FL)
KW - Label noise mitigation
KW - Model fairness
KW - Wisdom Learning (WL)
UR - https://www.scopus.com/pages/publications/105001875868
U2 - 10.1016/j.ipm.2025.104157
DO - 10.1016/j.ipm.2025.104157
M3 - Article
AN - SCOPUS:105001875868
SN - 0306-4573
VL - 62
JO - Information Processing and Management
JF - Information Processing and Management
IS - 5
M1 - 104157
ER -