Opto-Electronic Engineering, Volume 44, Issue 9, 895 (2017)

Multiband fusion image evaluation method based on the correlation between subjective and objective evaluation

Ze Han and Suzhen Lin*
Author Affiliations: [in Chinese]
    References (23)

    [1] Li Shutao, Kang Xudong, Fang Leyuan, et al. Pixel-level image fusion: a survey of the state of the art [J]. Information Fusion, 2017, 33: 100–112.

    [2] Ding Li, Huang Hua, Zang Yu. Image quality assessment using directional anisotropy structure measurement [J]. IEEE Transactions on Image Processing, 2017, 26(4): 1799–1809.

    [3] Krasula L, Le Callet P, Fliegel K, et al. Quality assessment of sharpened images: challenges, methodology, and objective metrics [J]. IEEE Transactions on Image Processing, 2017, 26(3): 1496–1508.

    [4] Vega M T, Mocanu D C, Stavrou S, et al. Predictive no-reference assessment of video quality [J]. Signal Processing: Image Communication, 2017, 52: 20–32.

    [5] Alaql O, Ghazinour K, Lu C C. Classification of image distortions for image quality assessment [C]// Proceedings of International Conference on Computational Science and Computational Intelligence, 2016: 653–658.

    [6] Yan Wen, Gong Fei, Zhou Ying, et al. Satellite cloud image fusion based on adaptive PCNN and NSST [J]. Opto-Electronic Engineering, 2016, 43(10): 70–76, 83.

    [7] Yin Ming, Duan Puhong, Chu Biao, et al. CT and MRI medical image fusion based on shift-invariant shearlet transform and compressed sensing [J]. Opto-Electronic Engineering, 2016, 43(8): 47–52.

    [8] Zhang Xuedian, Wang Hong, Jiang Minshan, et al. Applications of saliency analysis in focus image fusion [J]. Opto-Electronic Engineering, 2017, 44(4): 435–441.

    [9] Liu Yu, Chen Xun, Peng Hu, et al. Multi-focus image fusion with a deep convolutional neural network [J]. Information Fusion, 2017, 36: 191–207.

    [10] Zhang Kai, Wang Min, Yang Shuyuan. Multispectral and hyperspectral image fusion based on group spectral embedding and low-rank factorization [J]. IEEE Transactions on Geoscience and Remote Sensing, 2017, 55(3): 1363–1371.

    [11] Wang Zhou, Bovik A C. A universal image quality index [J]. IEEE Signal Processing Letters, 2002, 9(3): 81–84.

    [12] He Guiqing, Liang Fan, Xing Siyuan, et al. Study on algorithm evaluation of image fusion based on multi-hierarchical synthetic analysis [C]// Proceedings of 2016 IEEE International Conference on Signal Processing, Communications and Computing, 2016: 1–6.

    [13] Zhu Yahui. Research on quality evaluation methods of infrared and visible image fusion [D]. Xi'an: Northwestern Polytechnical University, 2015.

    [14] Xydeas C S, Petrovic V. Objective image fusion performance measure [J]. Electronics Letters, 2000, 36(4): 308–309.

    [15] Piella G, Heijmans H. A new quality metric for image fusion [C]// Proceedings of 2003 International Conference on Image Processing, 2003, 2: III-173–176.

    [16] Nizami I F, Majid M, Khurshid K. Efficient feature selection for blind image quality assessment based on natural scene statistics [C]// Proceedings of 2017 14th International Bhurban Conference on Applied Sciences and Technology, 2017: 318–322.

    [17] Ding Yong, Zhao Yang, Zhao Xinyu. Image quality assessment based on multi-feature extraction and synthesis with support vector regression [J]. Signal Processing: Image Communication, 2017, 54: 81–92.

    [18] Mukherjee R, Debattista K, Bashford-Rogers T, et al. Objective and subjective evaluation of high dynamic range video compression [J]. Signal Processing: Image Communication, 2016, 47: 426–437.

    [19] Liu Yu, Liu Shuping, Wang Zengfu. A general framework for image fusion based on multi-scale transform and sparse representation [J]. Information Fusion, 2015, 24: 147–164.

    [20] Jagalingam P, Hegde A V. A review of quality metrics for fused image [J]. Aquatic Procedia, 2015, 4: 133–142.

    [21] Zhang Xiaoli, Li Xiongfei, Li Jun. Validation and correlation analysis of metrics for evaluating performance of image fusion [J]. Acta Automatica Sinica, 2014, 40(2): 306–315.

    [22] Han Yu, Cai Yunze, Cao Yin, et al. A new image fusion performance metric based on visual information fidelity [J]. Information Fusion, 2013, 14(2): 127–135.

    [23] Warne R T. Testing Spearman's hypothesis with advanced placement examination data [J]. Intelligence, 2016, 57: 87–95.

    Citation: Ze Han, Suzhen Lin. Multiband fusion image evaluation method based on the correlation between subjective and objective evaluation [J]. Opto-Electronic Engineering, 2017, 44(9): 895.

    Paper Information

    Category:

    Received: May 30, 2017

    Accepted: --

    Published Online: Dec. 1, 2017

    The Author Email: Lin Suzhen (13835163417@163.com)

    DOI: 10.3969/j.issn.1003-501x.2017.09.006
