Laser Technology, Volume 48, Issue 5, 752 (2024)
Two-step phase unwrapping based on swin-UNet-denoise and least square method
[1] GHIGLIA D C, PRITT M D. Two-dimensional phase unwrapping: Theory, algorithms, and software[M]. New York: Wiley, 1998: 1-59.
[2] YU H, LAN Y, YUAN Z, et al. Phase unwrapping in InSAR: A review[J]. IEEE Geoscience and Remote Sensing Magazine, 2019, 7(1): 40-58.
[3] SONG S M, NAPEL S, PELC N J, et al. Phase unwrapping of MR phase images using Poisson equation[J]. IEEE Transactions on Image Processing, 1995, 4(5): 667-676.
[4] WYANT J C. Interferogram analysis: Digital fringe pattern measurement techniques[J]. Optical Engineering, 1993, 32(11): 2987-2988.
[5] GOLDSTEIN R M, ZEBKER H A, WERNER C L. Satellite radar interferometry: Two-dimensional phase unwrapping[J]. Radio Science, 1988, 23(4): 713-720.
[6] XU W, CUMMING I. A region-growing algorithm for InSAR phase unwrapping[J]. IEEE Transactions on Geoscience and Remote Sensing, 1999, 37(1): 124-134.
[7] FLYNN T J. Consistent 2-D phase unwrapping guided by a quality map[C]//1996 International Geoscience and Remote Sensing Symposium. Lincoln, USA: IEEE Press, 1996: 2057-2059.
[8] FLYNN T J. Two-dimensional phase unwrapping with minimum weighted discontinuity[J]. Journal of the Optical Society of America A, 1997, 14(10): 2692-2701.
[9] BONE D J. Fourier fringe analysis: The two-dimensional phase unwrapping problem[J]. Applied Optics, 1991, 30(25): 3627-3632.
[10] GHIGLIA D C, ROMERO L A. Minimum Lp-norm two-dimensional phase unwrapping[J]. Journal of the Optical Society of America A, 1996, 13(10): 1999-2013.
[11] HYUN J S, ZHANG S. Enhanced two-frequency phase-shifting method[J]. Applied Optics, 2016, 55(16): 4395-4401.
[12] ZHANG S. Digital multiple wavelength phase shifting algorithm[J]. Proceedings of the SPIE, 2009, 7432: 74320N.
[13] WANG K Q, LI Y, QIAN K M, et al. One-step robust deep learning phase unwrapping[J]. Optics Express, 2019, 27(10): 15100-15115.
[14] ZHOU L, YU H, PASCAZIO V, et al. PU-GAN: A one-step 2-D InSAR phase unwrapping based on conditional generative adversarial network[J]. IEEE Transactions on Geoscience and Remote Sensing, 2022, 60: 1-10.
[15] FANG Q, XIA H T, SONG Q H, et al. Speckle denoising based on deep learning via a conditional generative adversarial network in digital holographic interferometry[J]. Optics Express, 2022, 30(12): 20666-20683.
[16] SPOORTHI G E, GORTHI S, GORTHI R K S S. PhaseNet: A deep convolutional neural network for two-dimensional phase unwrapping[J]. IEEE Signal Processing Letters, 2018, 26(1): 54-58.
[17] SPOORTHI G E, GORTHI R K S S, GORTHI S. PhaseNet 2.0: Phase unwrapping of noisy data based on deep learning approach[J]. IEEE Transactions on Image Processing, 2020, 29: 4862-4872.
[19] WANG K Q, KEMAO Q, DI J L, et al. Deep learning spatial phase unwrapping: A comparative review[J]. Advanced Photonics Nexus, 2022, 1(1): 014001.
[20] MONTRESOR S, PICART P. Quantitative appraisal for noise reduction in digital holographic phase imaging[J]. Optics Express, 2016, 24(13): 14322-14343.
[21] FAN C M, LIU T J, LIU K H. SUNet: Swin transformer UNet for image denoising[C]//2022 IEEE International Symposium on Circuits and Systems (ISCAS). Austin, USA: IEEE Press, 2022: 2333-2337.
[22] LIU Z, HU H, LIN Y, et al. Swin transformer V2: Scaling up capacity and resolution[C]//2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition. New Orleans, USA: IEEE Press, 2022: 11999-12009.
[23] HE K M, ZHANG X Y, REN S Q, et al. Delving deep into rectifiers: Surpassing human-level performance on ImageNet classification[C]//2015 IEEE International Conference on Computer Vision (ICCV). Santiago, Chile: IEEE Press, 2015: 1026-1034.
LIAO Houzhang, KONG Yong, ZHANG He, WU Huihui, TONG Xiaofan, ZHAO Li. Two-step phase unwrapping based on swin-UNet-denoise and least square method[J]. Laser Technology, 2024, 48(5): 752
Received: Aug. 17, 2023
Accepted: Dec. 2, 2024
Published Online: Dec. 2, 2024
Author email: KONG Yong (kkyy7757@aliyun.com)