Opto-Electronic Engineering, Volume 50, Issue 4, 220232 (2023)

Few-shot image classification via multi-scale attention and domain adaptation

Long Chen1,2, Jianlin Zhang1,*, Hao Peng1,2, Meihui Li1, Zhiyong Xu1, and Yuxing Wei1
Author Affiliations
  • 1Institute of Optics and Electronics, Chinese Academy of Sciences, Chengdu, Sichuan 610209, China
  • 2School of Electronic, Electrical and Communication Engineering, University of Chinese Academy of Sciences, Beijing 100049, China

    Learning from limited data is a challenging problem in visual recognition. When samples are scarce, the prototypes computed by metric-learning methods are inaccurate and the model generalizes poorly. To improve few-shot image classification, the following measures are adopted. First, to address the shortage of samples, a masked autoencoder is used for data augmentation. Second, prototypes are computed from task-specific features extracted by a multi-scale attention mechanism, which makes the prototypes more accurate. Third, a domain adaptation module with a margin loss is added; the margin loss pushes prototypes of different classes apart in the feature space, and the resulting margin improves the generalization of the method. Experimental results show that the proposed method achieves better few-shot classification performance.
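    To make the prototype and margin ideas above concrete, the sketch below shows a minimal PyTorch illustration. It is not the authors' implementation: the masked-autoencoder augmentation, the multi-scale attention module, and the domain adaptation branch are omitted, the support features are assumed to be already attention-refined, and names such as compute_prototypes, proto_margin_loss, and the margin value of 0.5 are assumptions made here for illustration.

    import torch
    import torch.nn.functional as F

    def compute_prototypes(support_feats, support_labels, num_classes):
        # Class prototype = mean of the (attention-refined) support features of that class.
        return torch.stack([support_feats[support_labels == c].mean(dim=0)
                            for c in range(num_classes)])            # [num_classes, feat_dim]

    def proto_margin_loss(protos, margin=0.5):
        # Penalize prototype pairs whose cosine similarity exceeds (1 - margin),
        # pushing prototypes of different classes apart in the feature space.
        protos = F.normalize(protos, dim=-1)
        sim = protos @ protos.t()                                     # pairwise cosine similarity
        off_diag = sim - torch.eye(len(protos), device=protos.device)
        return F.relu(off_diag - (1.0 - margin)).sum() / (len(protos) * (len(protos) - 1))

    def classify(query_feats, protos):
        # Nearest-prototype classification by negative Euclidean distance.
        return (-torch.cdist(query_feats, protos)).softmax(dim=-1)   # [num_query, num_classes]

    # Example: a 5-way, 1-shot episode with 64-dimensional features.
    support = torch.randn(5, 64)
    labels = torch.arange(5)
    protos = compute_prototypes(support, labels, num_classes=5)
    loss_margin = proto_margin_loss(protos)    # added to the episode's classification loss
    probs = classify(torch.randn(10, 64), protos)

    In training, the margin term would be added to the classification loss of each episode, so that sufficient inter-prototype margin is maintained while the metric classifier is learned.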

    Citation: Long Chen, Jianlin Zhang, Hao Peng, Meihui Li, Zhiyong Xu, Yuxing Wei. Few-shot image classification via multi-scale attention and domain adaptation[J]. Opto-Electronic Engineering, 2023, 50(4): 220232

    Paper Information

    Category: Article

    Received: Sep. 22, 2022

    Accepted: Dec. 29, 2022

    Published Online: Jun. 15, 2023

    The Author Email: Jianlin Zhang (jlin_zh@163.com)

    DOI: 10.12086/oee.2023.220232
