Laser & Optoelectronics Progress, Volume 59, Issue 14, 1415008 (2022)

Depth-Adaptive Dynamic Neural Networks: A Survey

Yi Sun, Jian Li*, Xin Xu**, and Yuru Wang
Author Affiliations
  • College of Intelligence Science and Technology, National University of Defense Technology, Changsha 410000, Hunan, China

    Paper Information

    Category: Machine Vision

    Received: Apr. 12, 2022

    Accepted: May 23, 2022

    Published Online: Jul. 1, 2022

    Author Emails: Jian Li (lijian@nudt.edu.cn), Xin Xu (xinxu@nudt.edu.cn)

    DOI: 10.3788/LOP202259.1415008
