Chinese Journal of Lasers, Volume 52, Issue 9, 0907107 (2025)
Ultra‐Wide‐Angle Fundus Images Stitching Based on Bidirectional Linear Weight Fusion
In ophthalmic medical imaging, the stitching of ultra-wide-angle fundus images is essential for comprehensively observing and assessing patients' retinal health. This technology provides a broader visual field, allowing doctors to gain a more intuitive understanding of the entire fundus region, thereby offering crucial support for early screening, diagnosis, and treatment planning of diseases. However, stitching ultra-wide-angle fundus images faces multiple challenges, particularly in the multi-scene, multi-angle shooting scenarios encountered in practice. Variations in perspective and differences in exposure can cause significant problems in image matching and fusion, most notably visible stitching seams. These seams not only degrade image quality but may also hinder doctors' identification of lesion areas. To address these challenges, we aim to develop an efficient algorithm that ensures high-quality, seamless stitching under varying exposure conditions and viewing angles. By minimizing stitching seams and enhancing image smoothness, we seek to provide more precise and reliable technical support for the diagnosis and monitoring of fundus diseases.
We present a novel image-stitching approach based on computer vision to address critical challenges in stitching ultra-wide-angle fundus images. The speeded-up robust features (SURF) algorithm was first employed to extract key feature points that accurately depict prominent regions of fundus images, such as bifurcation points of retinal vessels and structural boundaries. Potential correspondences between feature points in different images were identified through initial matching. However, the initial matching results may contain a considerable number of mismatched points, which degrades stitching accuracy. To refine the results, the random sample consensus (RANSAC) algorithm was applied after initial matching. Through an iterative procedure, RANSAC eliminates mismatched points and preserves true feature matches, ultimately yielding an accurate transformation matrix for geometric image registration. To address the stitching seams caused by varying viewpoints and exposure differences, this study introduces an innovative bidirectional linear weight fusion method that follows a structured process. First, the center point of the overlapping area was extracted, and an image rotation alignment technique was used to ensure correct geometric alignment of the images. Next, weights were assigned to the overlapping area to form a bidirectional linear weight mask, allowing pixel values in the transition zone to be blended smoothly. Finally, a mask with linearly decreasing weights was generated, ensuring a smooth transition between images and effectively eliminating the stitching seams caused by exposure variations.
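The registration and fusion steps can be sketched in Python with OpenCV. This is a minimal illustration under simplifying assumptions, not the authors' implementation: it assumes opencv-contrib-python with the nonfree SURF module enabled, assumes the two images share a roughly left/right overlap on a common canvas after warping, and omits the paper's overlap-center extraction and rotation alignment step; the function names register_surf_ransac and blend_bidirectional are placeholders introduced here.

```python
# Sketch only: SURF + RANSAC registration and a simple bidirectional linear
# weight blend over the overlapping columns (not the authors' code).
import cv2
import numpy as np

def register_surf_ransac(img_left, img_right, ratio=0.75):
    """Estimate a homography that maps img_right onto img_left."""
    gray_l = cv2.cvtColor(img_left, cv2.COLOR_BGR2GRAY)
    gray_r = cv2.cvtColor(img_right, cv2.COLOR_BGR2GRAY)
    # SURF keypoints and descriptors (requires the nonfree xfeatures2d module)
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
    kp_l, des_l = surf.detectAndCompute(gray_l, None)
    kp_r, des_r = surf.detectAndCompute(gray_r, None)
    # Initial matching with Lowe's ratio test to discard ambiguous pairs
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    knn = matcher.knnMatch(des_r, des_l, k=2)
    good = [m for m, n in knn if m.distance < ratio * n.distance]
    src = np.float32([kp_r[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_l[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    # RANSAC rejects residual mismatches and yields the transformation matrix
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H

def blend_bidirectional(canvas_left, canvas_right):
    """Blend two same-size canvases with a linear weight ramp over their overlap."""
    mask_l = canvas_left.sum(axis=2) > 0
    mask_r = canvas_right.sum(axis=2) > 0
    overlap = mask_l & mask_r
    out = np.where(mask_r[..., None], canvas_right, canvas_left).astype(np.float64)
    cols = np.where(overlap.any(axis=0))[0]
    if cols.size:
        x0, x1 = cols[0], cols[-1]
        # Left-image weight falls linearly from 1 to 0 across the overlap while
        # the right-image weight rises symmetrically (a bidirectional linear ramp)
        w = np.linspace(1.0, 0.0, x1 - x0 + 1)[None, :, None]
        left = canvas_left[:, x0:x1 + 1].astype(np.float64)
        right = canvas_right[:, x0:x1 + 1].astype(np.float64)
        blended = w * left + (1.0 - w) * right
        keep = overlap[:, x0:x1 + 1, None]
        out[:, x0:x1 + 1] = np.where(keep, blended, out[:, x0:x1 + 1])
    return np.clip(out, 0, 255).astype(np.uint8)
```

In typical use, the homography returned by register_surf_ransac would be applied with cv2.warpPerspective to place the second image onto the panorama canvas before calling blend_bidirectional.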
Through experimental verification, the proposed stitching algorithm, which combines SURF with bidirectional linear weight fusion, demonstrates significant performance advantages under various exposure conditions and viewing angles. Compared with traditional algorithms, such as the maximum fusion algorithm and the gradual-in gradual-out (fade-in/fade-out) fusion algorithm, the proposed algorithm achieves superior visual effects and smoother stitching. Experimental results show that it significantly reduces seam visibility when processing images with substantial exposure differences: relative to the traditional algorithms, the average gradient in the stitching area is reduced by 50.43% and the standard deviation by 11.91%, indicating a marked improvement in seamlessness. Additionally, the algorithm effectively preserves image integrity. Information entropy, a key metric of image information content, is only 3.13% lower than that of the traditional algorithms, suggesting that while the weight fusion eliminates stitching seams, the overall richness of image information remains nearly intact. As a result, the stitched images are not only more visually coherent but also retain critical medical details, providing reliable support for fundus disease diagnosis.
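The three evaluation indices cited above (average gradient, standard deviation, and information entropy) are standard image statistics. A minimal sketch of one common way to compute them, assuming an 8-bit image and Sobel gradients (not necessarily the exact definitions used in the paper), is given below.

```python
# Sketch: seam-region quality metrics under assumed standard definitions.
import cv2
import numpy as np

def stitching_metrics(region_bgr):
    gray = cv2.cvtColor(region_bgr, cv2.COLOR_BGR2GRAY).astype(np.float64)
    # Average gradient: mean magnitude of horizontal/vertical derivatives;
    # a lower value in the seam area indicates a less visible stitching seam.
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)
    avg_gradient = np.mean(np.sqrt((gx ** 2 + gy ** 2) / 2.0))
    # Standard deviation: spread of intensities around the mean gray level.
    std_dev = gray.std()
    # Shannon information entropy of the 8-bit intensity histogram (bits/pixel);
    # a nearly unchanged value suggests the blend preserves image information.
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    p = hist / hist.sum()
    entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
    return avg_gradient, std_dev, entropy
```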
Based on experimental results and theoretical analysis, we propose a fundus image stitching method that integrates SURF with bidirectional linear weight fusion and demonstrates excellent performance in multi-scene, multi-angle stitching scenarios. SURF extracts precise feature points, while the RANSAC algorithm ensures reliable geometric registration, thereby enhancing stitching accuracy. To resolve exposure differences in overlapping areas, a bidirectional linear weight mask is designed to effectively eliminate stitching seams and significantly improve image smoothness and visual coherence. Experimental results further confirm that this algorithm outperforms traditional approaches in terms of average gradient and standard deviation in the stitched area while preserving information integrity. The slight 3.13% reduction in information entropy indicates that the method effectively balances seamless stitching with the retention of medical image details. This advancement is particularly valuable for medical diagnostic applications that require high-precision panoramic fundus images, such as the early detection and monitoring of diabetic retinopathy and glaucoma. In conclusion, the proposed algorithm not only achieves high-precision, seamless stitching but also introduces innovative tools and techniques for ophthalmic medical imaging. These findings highlight its broad clinical applicability and its potential for extension to more complex medical image analysis scenarios, fostering advancements in medical imaging technology.
Guilin Liu, Dewen Xu, Lin Ji, Yun Xiao, Xin Miu, Wei Xia, Yunhai Zhang. Ultra‐Wide‐Angle Fundus Images Stitching Based on Bidirectional Linear Weight Fusion[J]. Chinese Journal of Lasers, 2025, 52(9): 0907107
Category: Biomedical Optical Imaging
Received: Dec. 21, 2024
Accepted: Jan. 31, 2025
Published Online: Apr. 24, 2025
Author Email: Zhang Yunhai (zhangyh@sibet.ac.cn)
CSTR:32183.14.CJL241481