Journal of Beijing Normal University, Vol. 61, Issue 3, 300 (2025)
A knowledge distillation based heterogeneous federated forgetting learning algorithm
To address privacy concerns in federated learning over heterogeneous data, we propose a heterogeneous federated unlearning algorithm that integrates knowledge distillation (KD) with a forgetting mechanism. The method extracts generic knowledge from the global model and distills it to local clients, preserving the learning capability of local models while effectively removing information associated with sensitive data, thereby enabling privacy-oriented unlearning. The approach is found to maintain strong model performance across diverse data structures and device configurations while significantly enhancing data privacy. This study provides an effective technical solution for privacy preservation in heterogeneous federated learning settings.
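The distillation-with-forgetting idea in the abstract can be sketched as a distillation loss that transfers the global (teacher) model's knowledge to a local (student) model only on retained samples, so samples marked for forgetting contribute nothing. This is a minimal illustrative sketch, not the authors' implementation: the function names, the temperature `T`, and the retain-mask formulation are all assumptions.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_unlearning_loss(teacher_logits, student_logits, retain_mask, T=2.0):
    """Hypothetical KD loss restricted to retained samples.

    Rows of `retain_mask` set to False correspond to sensitive samples
    to be forgotten; they are excluded so the student is not pushed to
    reproduce the teacher's behavior on them.
    """
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    # Per-sample KL(p_t || p_s); T^2 scaling as in standard KD
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    mask = np.asarray(retain_mask, dtype=float)
    return float((kl * mask).sum() / max(mask.sum(), 1.0)) * T ** 2
```

In this sketch, when the student matches the teacher on all retained samples the loss is zero regardless of how the two models disagree on the forgotten samples, which is the intended separation between preserved and removed knowledge.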
WANG Yajie, TANG Xiangyun, ZHU Liehuang. A knowledge distillation based heterogeneous federated forgetting learning algorithm[J]. Journal of Beijing Normal University, 2025, 61(3): 300
Received: Apr. 9, 2025
Accepted: Aug. 21, 2025
Published Online: Aug. 21, 2025
The Author Email: TANG Xiangyun (xiangyunt@muc.edu.cn)