Journal of Beijing Normal University, Volume 61, Issue 3, 300 (2025)

A knowledge distillation-based heterogeneous federated unlearning algorithm

WANG Yajie1, TANG Xiangyun2,3,*, and ZHU Liehuang1
Author Affiliations
  • 1School of Cyberspace Security, Beijing Institute of Technology, Beijing, China
  • 2School of Information Engineering, Minzu University of China, Beijing, China
  • 3Key Laboratory of Computing Power Network and Information Security, Ministry of Education, Qilu University of Technology (Shandong Academy of Sciences), Jinan, Shandong, China

    To address privacy concerns in federated learning over heterogeneous data, we propose a heterogeneous federated unlearning algorithm that integrates knowledge distillation (KD) with a forgetting mechanism. The method extracts generic knowledge from the global model and distills it to local clients, preserving the learning capability of local models while effectively removing information associated with sensitive data, thereby enabling privacy-oriented unlearning. The approach is found to maintain strong model performance across diverse data structures and device configurations while significantly enhancing data privacy. This study provides an effective technical solution for privacy preservation in heterogeneous federated learning settings.
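
    As a rough illustration of the distill-then-forget idea summarized in the abstract, the Python (PyTorch) sketch below combines a KD loss against the global model on retained data with a gradient-ascent forgetting term on the data to be removed. This is a minimal sketch, not the authors' implementation: the function name, hyperparameters (temperature, kd_weight, forget_weight), and the gradient-ascent forgetting heuristic are all assumptions; the paper's exact mechanism may differ.

    import torch
    import torch.nn.functional as F

    def client_unlearning_step(local_model, global_model, retain_batch, forget_batch,
                               optimizer, temperature=2.0, kd_weight=0.5,
                               forget_weight=1.0):
        """One hypothetical client update: distill generic knowledge from the
        global model on retained data while pushing the local model away from
        fitting the data marked for forgetting."""
        local_model.train()
        global_model.eval()

        x_r, y_r = retain_batch   # data the client keeps
        x_f, y_f = forget_batch   # sensitive data to be unlearned

        optimizer.zero_grad()

        # Task loss on retained data preserves local utility.
        logits_r = local_model(x_r)
        task_loss = F.cross_entropy(logits_r, y_r)

        # KD loss: match softened global-model outputs on retained data only,
        # so only generic knowledge is transferred to the client.
        with torch.no_grad():
            teacher_logits = global_model(x_r)
        kd_loss = F.kl_div(
            F.log_softmax(logits_r / temperature, dim=1),
            F.softmax(teacher_logits / temperature, dim=1),
            reduction="batchmean",
        ) * temperature ** 2

        # Forgetting term: gradient ascent on the forget set, a common
        # unlearning heuristic used here purely for illustration.
        logits_f = local_model(x_f)
        forget_loss = -F.cross_entropy(logits_f, y_f)

        loss = task_loss + kd_weight * kd_loss + forget_weight * forget_loss
        loss.backward()
        optimizer.step()
        return loss.item()

    Restricting the distillation term to retained data is what keeps the transfer "generic": the global model's behavior on the forget set is never imitated, while the ascent term actively degrades the local model's fit to it.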

    Citation

    WANG Yajie, TANG Xiangyun, ZHU Liehuang. A knowledge distillation-based heterogeneous federated unlearning algorithm[J]. Journal of Beijing Normal University, 2025, 61(3): 300

    Paper Information

    Received: Apr. 9, 2025

    Accepted: Aug. 21, 2025

    Published Online: Aug. 21, 2025

    Corresponding Author Email: TANG Xiangyun (xiangyunt@muc.edu.cn)

    DOI: 10.12202/j.0476-0301.2025059
