Journal of Optoelectronics · Laser, Volume 35, Issue 12, 1267 (2024)

Lightweight stereoscopic image quality assessment method combining peripheral vision

WANG Yang1,2, JIA Xiran1,2, LONG Haiyan1,2, and HAN Liying1,2
Author Affiliations
  • 1School of Electronic and Information Engineering, Hebei University of Technology, Tianjin 300401, China
  • 2Tianjin Key Laboratory of Electronic Materials & Devices, Hebei University of Technology, Tianjin 300401, China
    References (25)

    [1] KANG L, YE P, LI Y, et al. Convolutional neural networks for no-reference image quality assessment[C]//IEEE Conference on Computer Vision and Pattern Recognition, June 23-28, 2014, Columbus, OH, USA. New York: IEEE Computer Society, 2014: 1733-1740.

    [2] MENG F, LI S M, CHANG Y L. No-reference stereoscopic image quality assessment based on the human visual system[C]//IEEE International Conference on Acoustics, Speech and Signal Processing, June 6-11, 2021, Toronto, ON, Canada. New York: IEEE, 2021: 2100-2104.

    [3] XU J, ZHOU W, CHEN Z, et al. Binocular rivalry oriented predictive autoencoding network for blind stereoscopic image quality measurement[J]. IEEE Transactions on Instrumentation and Measurement, 2021, 70: 1-13.

    [4] SI J, HUANG B, YANG H, et al. A no-reference stereoscopic image quality assessment network based on binocular interaction and fusion mechanisms[J]. IEEE Transactions on Image Processing, 2022, 31: 3066-3080.

    [5] OUSSAMA M, CHETOUANI A. End-to-end deep multi-score model for no-reference stereoscopic image quality assessment[C]//IEEE International Conference on Image Processing, October 16-19, 2022, Bordeaux, France. New York: IEEE, 2022: 2721-2725.

    [6] HASAN M M, ISLAM M A, RAHMAN S, et al. No-reference quality assessment of transmitted stereoscopic videos based on human visual system[J]. Applied Sciences, 2022, 12(19): 10090.

    [8] LIU Y, HUANG B Q, YUE G H. Two-stream interactive network based on local and global information for no-reference stereoscopic image quality assessment[J]. Journal of Visual Communication and Image Representation, 2022, 87: 103586.

    [9] JOHNSON J. Designing with the mind in mind: simple guide to understanding user interface design rules[M]. San Francisco: Elsevier Science & Technology, 2011.

    [10] OUSSAMA M, CHETOUANI A, HACHOUF F, et al. 3D saliency guided deep quality predictor for no-reference stereoscopic images[J]. Neurocomputing, 2022, 478: 22-36.

    [11] ACHANTA R, HEMAMI S, ESTRADA F, et al. Frequency-tuned salient region detection[C]//IEEE Conference on Computer Vision and Pattern Recognition, June 20-25, 2009, Miami, FL, USA. New York: IEEE, 2009: 1597-1604.

    [12] DAVSON H. Physiology of the eye[M]. New York: Pergamon Press, 1990.

    [13] CHEN M J, SU C C, KWON D K, et al. Full-reference quality assessment of stereopairs accounting for rivalry[J]. Signal Processing: Image Communication, 2013, 28(9): 1143-1155.

    [14] SI J W, YANG H, HUANG B X. A full-reference stereoscopic image quality assessment index based on stable aggregation of monocular and binocular visual features[J]. IET Image Processing, 2021, 15(8): 1629-1643.

    [15] ZHANG H, HU X W, GOU R Y, et al. Rich structural index for stereoscopic image quality assessment[J]. Sensors, 2022, 22(2): 499.

    [16] MA J, XU G M, HAN X Y. Reduced-reference 3D image quality measurement via spatial to gradient domain feature aggregation[C]//2021 IEEE International Symposium on Broadband Multimedia Systems and Broadcasting, August 4-6, 2021, virtual conference. New York: IEEE, 2021: 1-6.

    [17] LIU Y, TANG C, ZHENG Z, et al. No-reference stereoscopic image quality evaluator with segmented monocular features and perceptual binocular features[J]. Neurocomputing, 2020, 405: 126-137.

    [18] HU J B, WANG X J, CHAI X L, et al. Deep network based stereoscopic image quality assessment via binocular summing and differencing[J]. Journal of Visual Communication and Image Representation, 2022, 82: 103420.

    [19] BOURBIA S, KARINE A, CHETOUANI A, et al. A multi-task convolutional neural network for blind stereoscopic image quality assessment using naturalness analysis[C]//IEEE International Conference on Image Processing, September 19-22, 2021, Anchorage, AK, USA. New York: IEEE, 2021: 1434-1438.

    [20] LI C F, YUN L X, CHEN H, et al. No-reference stereoscopic image quality assessment using 3D visual saliency maps fused with three-channel convolutional neural network[J]. Signal, Image and Video Processing, 2022, 16(1): 273-281.

    [21] SHEN L L, CHEN X F, PAN Z Q, et al. No-reference stereoscopic image quality assessment based on global and local content characteristics[J]. Neurocomputing, 2021, 424: 132-142.

    [22] WU L X, WANG S, SANG Q B. No-reference stereo image quality assessment based on transfer learning[J]. Journal of New Media, 2022, 4(3): 125-135.

    [23] SIM K, YANG J C, LU W, et al. Blind stereoscopic image quality evaluator based on binocular semantic and quality channels[J]. IEEE Transactions on Multimedia, 2021, 24: 1389-1398.

    [24] LI S M, LI Y Y, HAN Y T. Stereoscopic image quality assessment considering visual mechanism and multi-loss constraints[J]. Journal of Visual Communication and Image Representation, 2021, 79: 103255.

    [25] ZHOU W, CHEN Z, LI W. Dual-stream interactive networks for no-reference stereoscopic image quality assessment[J]. IEEE Transactions on Image Processing, 2019, 28(8): 3946-3958.

    [26] OH H, AHN S, KIM J, et al. Blind deep S3D image quality evaluation via local to global feature aggregation[J]. IEEE Transactions on Image Processing, 2017, 26(10): 4923-4936.

    Paper Information

    Received: Apr. 9, 2023

    Accepted: Dec. 31, 2024

    Published Online: Dec. 31, 2024

    DOI: 10.16136/j.joel.2024.12.0178
