Computer Applications and Software, Vol. 42, Issue 4, 319 (2025)
FEW-SHOT LEARNING BASED ON KNOWLEDGE DISTILLATION AND TRANSFER LEARNING
To address the problem of deep models overfitting the training data when samples are too few, we propose a few-shot learning method that combines knowledge distillation and transfer learning. To improve the feature representation ability of a shallow network on small-sample images, we design a multi-generation distillation network structure. A modified transfer learning structure is introduced to enhance the generalization ability of the network while adjusting only a few parameters. Multiple classifiers are then combined to fuse the networks obtained through distillation and transfer. Experiments on three standard few-shot datasets show that the proposed model effectively improves classification ability and yields more accurate few-shot predictions.
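The abstract does not give implementation details for the multi-generation distillation step. As a hedged illustration only, the sketch below implements the standard temperature-scaled distillation loss (Hinton et al., 2015) inside a born-again-style loop in which each trained student becomes the frozen teacher of the next generation, which is one common realization of "multi-generation distillation". The names kd_loss, train_generations, and make_student, and the hyperparameters T=4.0 and alpha=0.7, are illustrative assumptions, not taken from the paper.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    # Temperature-softened KL divergence between teacher and student
    # distributions, scaled by T^2 as in Hinton et al. (2015),
    # plus the usual hard-label cross-entropy term.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

def train_generations(make_student, loader, n_generations=3,
                      epochs=30, device="cpu"):
    # Multi-generation ("born-again") distillation: each trained
    # student is frozen and reused as the next generation's teacher.
    teacher = None
    for g in range(n_generations):
        student = make_student().to(device)
        opt = torch.optim.Adam(student.parameters(), lr=1e-3)
        for _ in range(epochs):
            for x, y in loader:
                x, y = x.to(device), y.to(device)
                logits = student(x)
                if teacher is None:
                    # Generation 0 has no teacher: plain supervised loss.
                    loss = F.cross_entropy(logits, y)
                else:
                    with torch.no_grad():
                        t_logits = teacher(x)
                    loss = kd_loss(logits, t_logits, y)
                opt.zero_grad()
                loss.backward()
                opt.step()
        teacher = student.eval()  # freeze as next generation's teacher
    return teacher
```

Here make_student would construct a fresh shallow network and loader would be a standard few-shot episode or mini-batch loader; both are placeholders for whatever architecture and sampling the paper actually uses.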
Huang Youwen, Hu Yanfang, Wei Guoqing. FEW-SHOT LEARNING BASED ON KNOWLEDGE DISTILLATION AND TRANSFER LEARNING[J]. Computer Applications and Software, 2025, 42(4): 319
Received: Nov. 13, 2021
Accepted: Aug. 25, 2025
Published Online: Aug. 25, 2025