Journal of Optoelectronics · Laser, Vol. 36, Issue 3, 324 (2025)

A novel retinal vascular image segmentation method based on STB and FSASC technology

LIU Hui1 and ZHU Zhengwei1,2,*
Author Affiliations
  • 1School of Information Engineering, Southwest University of Science and Technology, Mianyang, Sichuan 621010, China
  • 2Robot Technology Used for Special Environment Key Laboratory of Sichuan Province, Mianyang, Sichuan 621010, China

    Retinal vascular image segmentation is an important and difficult task in medical image analysis, and conventional methods struggle to detect small, densely distributed vascular structures effectively. To address this problem, a novel high-precision retinal vascular segmentation method is proposed that combines the swin transformer block (STB) with a full-scale attention skip connection (FSASC). By constructing a U-shaped encoder-decoder network, the method realizes self-attention from local to global scales, allowing the model to focus on key vascular features. The FSASC structure fuses features from different levels, providing a simple yet powerful mechanism for the model to learn multi-scale semantic and spatial information. The proposed method is evaluated on the public DRIVE and STARE datasets. The experimental results show that it achieves high-quality, high-precision segmentation of retinal vascular structures, and it outperforms U-Net and other methods in both detailed feature segmentation and overall segmentation accuracy.
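
    The abstract describes the architecture only at a high level. The sketch below is a minimal, illustrative PyTorch rendering of that idea, assuming a grayscale fundus input and a single fused decoder stage; the class names (SimpleTransformerBlock, FullScaleSkipFusion, TinySTBUNet), the channel widths, and the plain (non-shifted-window) attention are hypothetical simplifications, not the authors' STB/FSASC implementation.

```python
# Minimal sketch of a U-shaped encoder-decoder with transformer blocks and
# attention-weighted full-scale skip connections. Illustrative only; details
# (window attention, decoder depth, channel sizes) differ from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleTransformerBlock(nn.Module):
    """Simplified stand-in for a swin transformer block (no shifted windows)."""
    def __init__(self, dim, heads=4):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm2 = nn.LayerNorm(dim)
        self.mlp = nn.Sequential(nn.Linear(dim, dim * 4), nn.GELU(), nn.Linear(dim * 4, dim))

    def forward(self, x):                        # x: (B, C, H, W)
        b, c, h, w = x.shape
        t = x.flatten(2).transpose(1, 2)         # (B, H*W, C) token sequence
        n = self.norm1(t)
        t = t + self.attn(n, n, n)[0]            # global self-attention over pixels
        t = t + self.mlp(self.norm2(t))
        return t.transpose(1, 2).reshape(b, c, h, w)

class FullScaleSkipFusion(nn.Module):
    """Fuse encoder features from every scale at one decoder stage,
    re-weighting each scale with a learned sigmoid attention gate."""
    def __init__(self, in_channels_list, out_channels):
        super().__init__()
        self.projs = nn.ModuleList(nn.Conv2d(c, out_channels, 1) for c in in_channels_list)
        self.gates = nn.ModuleList(nn.Conv2d(out_channels, 1, 1) for _ in in_channels_list)
        self.merge = nn.Conv2d(out_channels * len(in_channels_list), out_channels, 3, padding=1)

    def forward(self, feats, target_hw):
        fused = []
        for proj, gate, f in zip(self.projs, self.gates, feats):
            f = F.interpolate(proj(f), size=target_hw, mode="bilinear", align_corners=False)
            fused.append(torch.sigmoid(gate(f)) * f)     # attention-weighted contribution
        return self.merge(torch.cat(fused, dim=1))

class TinySTBUNet(nn.Module):
    """Toy U-shaped network: strided-conv encoder with transformer blocks,
    one full-scale-fused decoder stage, and a 1-channel vessel probability map."""
    def __init__(self, ch=(16, 32, 64)):
        super().__init__()
        self.enc, prev = nn.ModuleList(), 1              # grayscale fundus input
        for c in ch:
            self.enc.append(nn.Sequential(nn.Conv2d(prev, c, 3, stride=2, padding=1),
                                          nn.ReLU(inplace=True)))
            prev = c
        self.blocks = nn.ModuleList(SimpleTransformerBlock(c) for c in ch)
        self.fuse = FullScaleSkipFusion(list(ch), 32)
        self.head = nn.Conv2d(32, 1, 1)

    def forward(self, x):
        feats = []
        for down, blk in zip(self.enc, self.blocks):
            x = blk(down(x))
            feats.append(x)
        y = self.fuse(feats, target_hw=feats[0].shape[-2:])   # decode at the finest scale
        y = F.interpolate(y, scale_factor=2, mode="bilinear", align_corners=False)
        return torch.sigmoid(self.head(y))                    # per-pixel vessel probability

if __name__ == "__main__":
    model = TinySTBUNet()
    out = model(torch.randn(1, 1, 64, 64))        # e.g. a 64x64 patch from DRIVE
    print(out.shape)                              # -> torch.Size([1, 1, 64, 64])
```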

    Citation
    LIU Hui, ZHU Zhengwei. A novel retinal vascular image segmentation method based on STB and FSASC technology[J]. Journal of Optoelectronics · Laser, 2025, 36(3): 324

    Paper Information

    Received: Oct. 8, 2023

    Accepted: Mar. 21, 2025

    Published Online: Mar. 21, 2025

    Author Email: ZHU Zhengwei (zhuzwin@163.com)

    DOI: 10.16136/j.joel.2025.03.0521
