Journal of Beijing Normal University, Volume 61, Issue 3, 300 (2025)
A knowledge distillation based heterogeneous federated forgetting learning algorithm
[1] MOOR M, BANERJEE O, ABAD Z S H, et al. Foundation models for generalist medical artificial intelligence[J]. Nature, 2023, 616(7956): 259
[2] HINTON G, VINYALS O, DEAN J. Distilling the knowledge in a neural network[EB/OL]. (2015-03-09) [2025-03-26]. https://arxiv.org/abs/1503.02531
[3] CHEN Z H, QIU G H, LI P, et al. MNGNAS: distilling adaptive combination of multiple searched networks for one-shot neural architecture search[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2023, 45(11): 13489
[4] BHARATHI MADAVARAPU J, ISLAM H, APPATHURAI A, et al. Heterogeneous energy harvesting techniques for smart home IoT acceleration[J]. IEEE Access, 2024, 12: 73667
[6] SHUVO M M H, ISLAM S K, CHENG J L, et al. Efficient acceleration of deep learning inference on resource-constrained edge devices: a review[J]. Proceedings of the IEEE, 2023, 111(1): 42
[7] GHIMIRE D, KIL D, KIM S H. A survey on efficient convolutional neural networks and hardware acceleration[J]. Electronics, 2022, 11(6): 945
[8] WANG L, YOON K J. Knowledge distillation and student-teacher learning for visual intelligence: a review and new outlooks[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2022, 44(6): 3048
[9] LI Z H, XU P F, CHANG X J, et al. When object detection meets knowledge distillation: a survey[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2023, 45(8): 10555
[11] GOU J P, YU B S, MAYBANK S J, et al. Knowledge distillation: a survey[J]. International Journal of Computer Vision, 2021, 129(6): 1789
[12] YU R N, LIU S H, WANG X C. Dataset distillation: a comprehensive review[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2024, 46(1): 150
[13] CHEN Y J, ZHENG B L, ZHANG Z H, et al. Deep learning on mobile and embedded devices[J]. ACM Computing Surveys, 2021, 53(4): 1
[14] KATARE D, PERINO D, NURMI J, et al. A survey on approximate edge AI for energy efficient autonomous driving services[J]. IEEE Communications Surveys & Tutorials, 2023, 25(4): 2714
[15] TIAN Y J, PEI S C, ZHANG X L, et al. Knowledge distillation on graphs: a survey[J]. ACM Computing Surveys, 2025, 57(8): 1
[18] YE M, FANG X W, DU B, et al. Heterogeneous federated learning: state-of-the-art and research challenges[J]. ACM Computing Surveys, 2024, 56(3): 1
[19] IFTIKHAR S, GILL S S, SONG C H, et al. AI-based fog and edge computing: a systematic review, taxonomy and future directions[J]. Internet of Things, 2023, 21: 100674
[20] PFEIFFER K, RAPP M, KHALILI R, et al. Federated learning for computationally constrained heterogeneous devices: a survey[J]. ACM Computing Surveys, 2023, 55(14s): 1
[21] CHOQUETTE-CHOO C A, DVIJOTHAM K, PILLUTLA K, et al. Correlated noise provably beats independent noise for differentially private learning[EB/OL]. (2024-05-07) [2025-03-26]. https://arxiv.org/abs/2310.06771
[22] XU C H, QU Y Y, XIANG Y, et al. Asynchronous federated learning on heterogeneous devices: a survey[J]. Computer Science Review, 2023, 50: 100595
[23] LEE J, SOLAT F, KIM T Y, et al. Federated learning-empowered mobile network management for 5G and beyond networks: from access to core[J]. IEEE Communications Surveys & Tutorials, 2024, 26(3): 2176
[24] GUENDOUZI B S, OUCHANI S, EL ASSAAD H, et al. A systematic review of federated learning: challenges, aggregation methods, and development tools[J]. Journal of Network and Computer Applications, 2023, 220: 103714
[25] QI P, CHIARO D, PICCIALLI F. Small models, big impact: a review on the power of lightweight federated learning[J]. Future Generation Computer Systems, 2025, 162: 107484
[26] YANG L X, BELIARD C, ROSSI D. Heterogeneous data-aware federated learning[EB/OL]. (2020-11-12) [2025-03-26]. https://arxiv.org/abs/2011.06393
[27] LIM W Y B, XIONG Z H, NIYATO D, et al. Realizing the metaverse with edge intelligence: a match made in heaven[J]. IEEE Wireless Communications, 2023, 30(4): 64
[28] MU X T, SHEN Y L, CHENG K, et al. FedProc: prototypical contrastive federated learning on non-IID data[J]. Future Generation Computer Systems, 2023, 143: 93
[29] BUYUKATES B, ULUKUS S. Timely communication in federated learning[EB/OL]. [2025-03-26]. https://ieeexplore.ieee.org/abstract/document/9484497
Citation: WANG Yajie, TANG Xiangyun, ZHU Liehuang. A knowledge distillation based heterogeneous federated forgetting learning algorithm[J]. Journal of Beijing Normal University, 2025, 61(3): 300
Received: Apr. 9, 2025
Accepted: Aug. 21, 2025
Published Online: Aug. 21, 2025
Author email: TANG Xiangyun (xiangyunt@muc.edu.cn)