Optics and Precision Engineering, Vol. 31, Issue 21, 3145 (2023)

Intra-inter channel attention for few-shot classification

Liping YANG1,*, Tianyang ZHANG1, Yuyang WANG1, and Xiaohua GU2
Author Affiliations
  • 1Key Laboratory of Optoelectronic Technique & System of Ministry of Education, Chongqing University, Chongqing 400044, China
  • 2School of Electrical Engineering, Chongqing University of Science & Technology, Chongqing 401331, China

    Paper Information

    Received: Apr. 6, 2023

    Accepted: --

    Published Online: Jan. 5, 2024

Author Email: Liping YANG (yanglp@cqu.edu.cn)

DOI: 10.37188/OPE.20233121.3145