Chinese Journal of Liquid Crystals and Displays, Volume 38, Issue 11, 1590 (2023)

FRKDNet: feature refine semantic segmentation network based on knowledge distillation

Shi-yi JIANG1, Yang XU1,2,*, Dan-yang LI1, and Run-ze FAN1
Author Affiliations
  • 1 College of Big Data and Information Engineering, Guizhou University, Guiyang 550025, China
  • 2 Guiyang Aluminum-magnesium Design and Research Institute Co., Ltd., Guiyang 550009, China

    Traditional knowledge distillation schemes for semantic segmentation still suffer from incomplete distillation and weak transfer of feature information, which limit network performance; moreover, the knowledge transferred by the teacher network is complex, so feature location information is easily lost. To address these problems, this paper presents FRKDNet, a feature refine semantic segmentation network based on knowledge distillation. First, a feature-separation method is designed to split the distilled knowledge into foreground content and background noise, filtering out the pseudo knowledge of the teacher network so that more accurate feature content is passed to the student network and feature quality is improved. At the same time, the inter-class and intra-class distances are extracted from the implicit encoding of the feature space to obtain the corresponding feature coordinate mask. The student network then imitates the teacher's feature location information, minimizing the discrepancy between their feature-location outputs, and the corresponding distillation losses are computed for the student network, which improves its segmentation accuracy and helps it converge faster. Finally, the method achieves excellent segmentation performance on the public Pascal VOC and Cityscapes datasets, reaching an MIoU of 74.19% and 76.53% respectively, 2.04% and 4.48% higher than the original student network. Compared with mainstream methods, the proposed method offers better segmentation performance and robustness, and provides a new approach to knowledge distillation for semantic segmentation.
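    The abstract's pipeline (separate foreground content from background noise in the teacher's features, derive a feature-location mask, and distill both content and location to the student) can be illustrated with a minimal NumPy sketch. This is an assumption-laden toy, not the paper's implementation: the threshold-based foreground mask, the function names, and the simple MSE loss terms are all hypothetical stand-ins for the feature-separation and coordinate-mask steps described above.

```python
import numpy as np

def fg_bg_separate(feat, thresh=0.5):
    """Split a (C, H, W) feature map into foreground and background.
    A magnitude threshold stands in for the paper's feature-separation
    step; the binary mask plays the role of the feature coordinate mask."""
    mag = np.abs(feat).mean(axis=0)                     # (H, W) channel-mean magnitude
    mask = (mag > thresh * mag.max()).astype(feat.dtype)
    return feat * mask, feat * (1.0 - mask), mask

def distill_loss(t_feat, s_feat, alpha=0.5):
    """Toy distillation loss: align student foreground content with the
    teacher's (content term) and align their location masks (location term)."""
    t_fg, _, t_mask = fg_bg_separate(t_feat)
    s_fg, _, s_mask = fg_bg_separate(s_feat)
    content = np.mean((t_fg - s_fg) ** 2)               # feature-content alignment
    location = np.mean((t_mask - s_mask) ** 2)          # feature-location alignment
    return content + alpha * location
```

A student whose features exactly match the teacher's incurs zero loss, and any mismatch in either content or predicted foreground location increases it, mirroring the two loss components the abstract describes.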

    Shi-yi JIANG, Yang XU, Dan-yang LI, Run-ze FAN. FRKDNet:feature refine semantic segmentation network based on knowledge distillation[J]. Chinese Journal of Liquid Crystals and Displays, 2023, 38(11): 1590

    Paper Information

    Category: Research Articles

    Received: Jan. 10, 2023

    Accepted: --

    Published Online: Nov. 29, 2023

    The Author Email: Yang XU (xuy@gzu.edu.cn)

    DOI: 10.37188/CJLCD.2023-0010
