Journal of Electronic Science and Technology, Vol. 22, Issue 3, 100278 (2024)

De-biased knowledge distillation framework based on knowledge infusion and label de-biasing techniques

Yan Li1,*, Tai-Kang Tian2, Meng-Yu Zhuang2, and Yu-Ting Sun3,*
Author Affiliations
  • 1School of Economics and Management, University of Electronic Science and Technology of China, Chengdu, 611731, China
  • 2School of Economics and Management, Beijing University of Posts and Telecommunications, Beijing, 100876, China
  • 3School of Electrical Engineering and Computer Science, The University of Queensland, Brisbane, 4072, Australia
    Figures & Tables (6)
    • Figure 1. Illustration of the classical knowledge distillation (KD) and our de-biased knowledge distillation (DeBKD).
    • Figure 2. Workflow of the proposed de-biased knowledge distillation framework.
    • Figure 3. Process of knowledge infusion.
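For readers unfamiliar with the baseline contrasted in Figure 1: classical KD trains the student to match the teacher's temperature-softened output distribution. The sketch below shows only this standard Hinton-style distillation loss, not the paper's DeBKD objective with knowledge infusion and label de-biasing; the function names and temperature value are illustrative.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: larger T yields a softer distribution.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, T=4.0):
    # KL divergence from the softened student distribution to the softened
    # teacher distribution, scaled by T^2 as in the classical KD formulation.
    p = softmax(teacher_logits, T)  # teacher "soft targets"
    q = softmax(student_logits, T)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return (T ** 2) * kl
```

When the student's logits equal the teacher's, the loss is zero; any mismatch gives a positive penalty, which is what drives the student toward the teacher's "dark knowledge".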
    • Table 1. Comparison of distillation effects across different datasets (all values in %).

      Method   |        D&C        |        C&H        |        C&B
               |  Acc   Pre   Rec  |  Acc   Pre   Rec  |  Acc   Pre   Rec
      Teacher  | 84.98 84.22 84.22 | 90.03 90.13 90.38 | 93.23 93.40 93.26
      KD-18    | 82.64 82.64 83.33 | 88.09 89.58 97.32 | 91.32 90.84 91.35
      KD-8     | 79.54 80.09 79.54 | 83.78 84.68 83.07 | 82.94 86.93 82.77
      FKD-18   | 82.98 82.88 83.05 | 88.74 89.94 87.67 | 91.79 91.87 91.65
      FKD-8    | 79.44 79.47 79.36 | 86.42 86.56 86.36 | 84.21 84.33 84.13
      DeBKD-18 | 83.12 83.58 83.12 | 89.78 89.85 89.57 | 93.34 93.85 93.31
      DeBKD-8  | 80.80 80.81 80.80 | 86.73 87.48 86.15 | 89.06 90.23 88.97
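The Acc/Pre/Rec columns in Table 1 are the standard classification metrics. As a quick reference, here is a minimal binary-case computation; reproducing the paper's multi-class numbers would additionally require an averaging scheme (e.g. macro-averaging over classes), which is an assumption not stated in this excerpt.

```python
def classification_metrics(y_true, y_pred, positive=1):
    """Accuracy, precision, and recall for a binary label list."""
    pairs = list(zip(y_true, y_pred))
    tp = sum(t == positive and p == positive for t, p in pairs)  # true positives
    fp = sum(t != positive and p == positive for t, p in pairs)  # false positives
    fn = sum(t == positive and p != positive for t, p in pairs)  # false negatives
    acc = sum(t == p for t, p in pairs) / len(pairs)
    pre = tp / (tp + fp) if (tp + fp) else 0.0
    rec = tp / (tp + fn) if (tp + fn) else 0.0
    return acc, pre, rec
```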
    • Table 2. Impact of different learning freedom on the classification accuracy (%).

      Learning freedom |       D&C        |       C&H        |       C&B
                       | DeBKD-8 DeBKD-18 | DeBKD-8 DeBKD-18 | DeBKD-8 DeBKD-18
      1/8              |  78.98   79.42   |  70.62   85.21   |  82.81   88.54
      1/10             |  79.48   82.08   |  83.71   86.73   |  86.79   91.02
      1/12             |  80.80   83.12   |  84.38   89.78   |  89.06   93.36
      1/14             |  79.64   80.36   |  86.73   85.81   |  85.47   91.15
      1/16             |  77.40   78.48   |  85.05   84.29   |  83.85   91.15
    • Table 3. Impact of knowledge infusion (KI) on model performance (Δ indicates the change in performance).

      Method | KI |      D&C       |      C&H       |      C&B
             |    | Acc (%)  Δ (%) | Acc (%)  Δ (%) | Acc (%)  Δ (%)
      KD     | ×  |  79.54    –    |  83.78    –    |  82.94    –
      KD     | ✓  |  79.86  +0.32  |  84.64  +0.86  |  83.61  +0.67
      FKD    | ×  |  79.44    –    |  86.42    –    |  84.21    –
      FKD    | ✓  |  79.97  +0.53  |  86.77  +0.35  |  85.02  +0.81
      DeBKD  | ×  |  79.64    –    |  86.73    –    |  87.11    –
      DeBKD  | ✓  |  80.80  +1.16  |  87.75  +1.44  |  89.06  +1.95
    Citation:
    Yan Li, Tai-Kang Tian, Meng-Yu Zhuang, Yu-Ting Sun. De-biased knowledge distillation framework based on knowledge infusion and label de-biasing techniques[J]. Journal of Electronic Science and Technology, 2024, 22(3): 100278

    Paper Information

    Received: Jun. 6, 2024

    Accepted: Aug. 15, 2024

    Published Online: Oct. 11, 2024

    Corresponding author emails: Yan Li (yanli@std.uestc.edu.cn), Yu-Ting Sun (skye.sun@uq.edu.au)

    DOI: 10.1016/j.jnlest.2024.100278
