Computer Applications and Software, Vol. 42, Issue 4, 319 (2025)

FEW-SHOT LEARNING BASED ON KNOWLEDGE DISTILLATION AND TRANSFER LEARNING

Huang Youwen, Hu Yanfang, and Wei Guoqing
Author Affiliations
  • School of Information Engineering, Jiangxi University of Science and Technology, Ganzhou 341000, Jiangxi, China

    To address the overfitting that deep models suffer when training samples are too few, we propose a few-shot learning method that combines knowledge distillation and transfer learning. To improve a shallow network's ability to represent features of small-sample images, we design a multi-generation distillation network structure. We also present a modified transfer-learning structure that enhances the network's generalization ability while adjusting only a few parameters. Finally, multiple classifiers are combined to fuse the networks obtained through distillation and transfer. Experiments on three standard few-shot datasets show that the proposed model effectively improves classification performance and yields more accurate few-shot predictions.
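    The abstract does not spell out the distillation objective, so the following is only an illustrative sketch of the standard (Hinton-style) soft/hard-loss combination that distillation methods typically build on; in a multi-generation setup, the student of one generation would then serve as the teacher of the next. The function name `distillation_loss` and the hyperparameters `T` (temperature) and `alpha` (soft/hard weighting) are our own illustrative choices, not taken from the paper.

    ```python
    import numpy as np

    def softmax(z, T=1.0):
        """Temperature-scaled softmax along the last axis."""
        z = np.asarray(z, dtype=float) / T
        z = z - z.max(axis=-1, keepdims=True)  # numerical stability
        e = np.exp(z)
        return e / e.sum(axis=-1, keepdims=True)

    def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
        """Weighted sum of a soft-label distillation term and a hard-label term.

        Soft term: KL divergence between teacher and student distributions,
        both softened by temperature T and rescaled by T^2 so its gradient
        magnitude stays comparable to the hard term.
        Hard term: ordinary cross-entropy with the ground-truth labels.
        """
        p_t = softmax(teacher_logits, T)
        p_s = softmax(student_logits, T)
        kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
        q = softmax(student_logits)
        ce = -np.log(q[np.arange(len(labels)), labels] + 1e-12)
        return float(np.mean(alpha * (T ** 2) * kl + (1 - alpha) * ce))
    ```

    A shallow student trained with this loss against a stronger teacher can then be fine-tuned with only a few trainable parameters on the target few-shot task, which is the transfer-learning half of the method described above.
    
    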

    Huang Youwen, Hu Yanfang, Wei Guoqing. FEW-SHOT LEARNING BASED ON KNOWLEDGE DISTILLATION AND TRANSFER LEARNING[J]. Computer Applications and Software, 2025, 42(4): 319

    Paper Information

    Received: Nov. 13, 2021

    Accepted: Aug. 25, 2025

    Published Online: Aug. 25, 2025

    DOI: 10.3969/j.issn.1000-386x.2025.04.045
