Laser & Optoelectronics Progress, Volume 60, Issue 22, 2200003 (2023)

Review of Multi-Exposure Image Fusion Methods

Xinli Zhu1, Yasheng Zhang2, Yuqiang Fang2,*, Xitao Zhang2, Jieping Xu2, and Di Luo1
Author Affiliations
  • 1Department of Graduate Management, Space Engineering University, Beijing 101416, China
  • 2Space Engineering University, Beijing 101416, China
    References (93)

    [1] Yan Q S. Research on image reconstruction method with high dynamic range[D], 20-26(2019).

    [2] Zhao Y. Research on multi-exposure image fusion algorithm[D], 10-15(2020).

    [3] Wang C M. Research on key technologies of multi-exposure image fusion[D], 10-20(2015).

    [4] Xu F, Liu J H, Song Y M et al. Multi-exposure image fusion techniques: a comprehensive review[J]. Remote Sensing, 14, 771(2022).

    [5] Pece F, Kautz J. Bitmap movement detection: HDR for dynamic scenes[C](2011).

    [7] Li S T, Kang X D. Fast multi-exposure image fusion with median filter and recursive filter[J]. IEEE Transactions on Consumer Electronics, 58, 626-632(2012).

    [8] Lee S H, Park J S, Cho N I. A multi-exposure image fusion based on the adaptive weights reflecting the relative pixel intensity and global gradient[C], 1737-1741(2018).

    [9] Xu Y D, Sun B B. Color-compensated multi-scale exposure fusion based on physical features[J]. Optik, 223, 165494(2020).

    [10] Ulucan O, Karakaya D, Turkan M. Multi-exposure image fusion based on linear embeddings and watershed masking[J]. Signal Processing, 178, 107791(2021).

    [11] Li M, Kong W W, Hu Y P et al. Adaptive multi-exposure image fusion based on weighted least squares[J]. China Sciencepaper, 16, 723-728(2021).

    [12] Kinoshita Y, Kiya H. Scene segmentation-based luminance adjustment for multi-exposure image fusion[J]. IEEE Transactions on Image Processing, 28, 4101-4116(2019).

    [13] Wang C M, He C, Xu M F. Fast exposure fusion of detail enhancement for brightest and darkest regions[J]. The Visual Computer, 37, 1233-1243(2021).

    [14] Goshtasby A A. Fusion of multi-exposure images[J]. Image and Vision Computing, 23, 611-618(2005).

    [15] Ma K D, Wang Z. Multi-exposure image fusion: a patch-wise approach[C], 1717-1721(2015).

    [16] Huang F, Zhou D M, Nie R C et al. A color multi-exposure image fusion approach using structural patch decomposition[J]. IEEE Access, 6, 42877-42885(2018).

    [17] Li W Z. Multi-exposure image fusion based on local features of scene[J]. Journal of Computer Applications, 40, 2365-2371(2020).

    [18] Li H, Ma K D, Yong H W et al. Fast multi-scale structural patch decomposition for multi-exposure image fusion[J]. IEEE Transactions on Image Processing, 29, 5805-5816(2020).

    [19] Li H, Chan T N, Qi X B et al. Detail-preserving multi-exposure fusion with edge-preserving structural patch decomposition[J]. IEEE Transactions on Circuits and Systems for Video Technology, 31, 4293-4304(2021).

    [20] Wang S P, Zhao Y. A novel patch-based multi-exposure image fusion using super-pixel segmentation[J]. IEEE Access, 8, 39034-39045(2020).

    [21] Shen R, Cheng I, Shi J B et al. Generalized random walks for fusion of multi-exposure images[J]. IEEE Transactions on Image Processing, 20, 3634-3646(2011).

    [22] Li Z G, Zheng J H, Rahardja S. Detail-enhanced exposure fusion[J]. IEEE Transactions on Image Processing, 21, 4672-4676(2012).

    [23] Song M L, Tao D C, Chen C et al. Probabilistic exposure fusion[J]. IEEE Transactions on Image Processing, 21, 341-357(2012).

    [24] Liu S G, Zhang Y. Detail-preserving underexposed image enhancement via optimal weighted multi-exposure fusion[J]. IEEE Transactions on Consumer Electronics, 65, 303-311(2019).

    [25] Ma K D, Duanmu Z F, Yeganeh H et al. Multi-exposure image fusion by optimizing a structural similarity index[J]. IEEE Transactions on Computational Imaging, 4, 60-72(2018).

    [26] Burt P J, Kolczynski R J. Enhanced image capture through fusion[C], 173-182(1993).

    [27] Mertens T, Kautz J, van Reeth F. Exposure fusion[C], 382-390(2007).

    [28] Shen J B, Zhao Y, Yan S C et al. Exposure fusion using boosting Laplacian pyramid[J]. IEEE Transactions on Cybernetics, 44, 1579-1590(2014).

    [29] Li S T, Kang X D, Hu J W. Image fusion with guided filtering[J]. IEEE Transactions on Image Processing, 22, 2864-2875(2013).

    [30] Yan Q S, Zhu Y, Zhou Y L et al. Enhancing image visuality by multi-exposure fusion[J]. Pattern Recognition Letters, 127, 66-75(2019).

    [31] Li Z G, Wei Z, Wen C Y et al. Detail-enhanced multi-scale exposure fusion[J]. IEEE Transactions on Image Processing, 26, 1243-1252(2017).

    [32] Singh H, Kumar V, Bhooshan S. A novel approach for detail-enhanced exposure fusion using guided filter[J]. The Scientific World Journal, 2014, 659217(2014).

    [33] Wang Q T, Chen W H, Wu X M et al. Detail-enhanced multi-scale exposure fusion in YUV color space[J]. IEEE Transactions on Circuits and Systems for Video Technology, 30, 2418-2429(2020).

    [34] Kou F, Li Z G, Wen C Y et al. Edge-preserving smoothing pyramid based multi-scale exposure fusion[J]. Journal of Visual Communication and Image Representation, 53, 235-244(2018).

    [35] Yang Y, Cao W, Wu S Q et al. Multi-scale fusion of two large-exposure-ratio images[J]. IEEE Signal Processing Letters, 25, 1885-1889(2018).

    [36] Tang L, Lu R S, Shi Y Q et al. High dynamic range imaging method based on YCbCr spatial fusion[J]. Laser & Optoelectronics Progress, 59, 1415029(2022).

    [37] Liu W H, Ma B Y. Multiexposure image fusion method based on feature weight of image sequence[J]. Laser & Optoelectronics Progress, 59, 0811008(2022).

    [38] Wu L F, Hu J B, Yuan C. Exposure fusion based on improved exposure evaluation and double pyramids[J]. Laser & Optoelectronics Progress, 58, 1410005(2021).

    [39] Zhang W, Cham W K. Gradient-directed multiexposure composition[J]. IEEE Transactions on Image Processing, 21, 2318-2323(2012).

    [40] Paul S, Sevcenco I S, Agathoklis P. Multi-exposure and multi-focus image fusion in gradient domain[J]. Journal of Circuits, Systems and Computers, 25, 1650123(2016).

    [41] Liu Y Y, Zhou D M, Nie R C et al. Construction of high dynamic range image based on gradient information transformation[J]. IET Image Processing, 14, 1327-1338(2020).

    [42] Gu B, Li W J, Wong J et al. Gradient field multi-exposure images fusion for high dynamic range image visualization[J]. Journal of Visual Communication and Image Representation, 23, 604-610(2012).

    [43] Wang J H, Liu H Z, He N. Exposure fusion based on sparse representation using approximate K-SVD[J]. Neurocomputing, 135, 145-154(2014).

    [44] Shao H, Jiang G Y, Yu M et al. Halo-free multi-exposure image fusion based on sparse representation of gradient features[J]. Applied Sciences, 8, 1543(2018).

    [45] Yang Y, Wu J H, Huang S Y et al. Multiexposure estimation and fusion based on a sparsity exposure dictionary[J]. IEEE Transactions on Instrumentation and Measurement, 69, 4753-4767(2020).

    [46] Kalantari N K, Ramamoorthi R. Deep high dynamic range imaging of dynamic scenes[J]. ACM Transactions on Graphics, 36, 1-12(2017).

    [47] Li H, Zhang L. Multi-exposure fusion with CNN features[C], 1723-1727(2018).

    [48] Liu Q G, Leung H. Variable augmented neural network for decolorization and multi-exposure fusion[J]. Information Fusion, 46, 114-127(2019).

    [49] Chen Y Y, Yu M, Jiang G Y et al. End-to-end single image enhancement based on a dual network cascade model[J]. Journal of Visual Communication and Image Representation, 61, 284-295(2019).

    [50] Cai J R, Gu S H, Zhang L et al. Learning a deep single image contrast enhancer from multi-exposure images[J]. IEEE Transactions on Image Processing, 27, 2049-2062(2018).

    [51] Prabhakar K R, Srikar V S, Babu R V. DeepFuse: a deep unsupervised approach for exposure fusion with extreme exposure image pairs[C], 4724-4732(2017).

    [52] Qi Y, Zhou S B, Zhang Z H et al. Deep unsupervised learning based on color un-referenced loss functions for multi-exposure image fusion[J]. Information Fusion, 66, 18-39(2021).

    [53] Han D, Li L, Guo X J et al. Multi-exposure image fusion via deep perceptual enhancement[J]. Information Fusion, 79, 248-262(2022).

    [54] Zhang Y, Liu Y, Sun P et al. IFCNN: a general image fusion framework based on convolutional neural network[J]. Information Fusion, 54, 99-118(2020).

    [55] Ma K D, Duanmu Z F, Zhu H W et al. Deep guided learning for fast multi-exposure image fusion[J]. IEEE Transactions on Image Processing, 29, 2808-2819(2019).

    [56] Xu H, Ma J Y, Jiang J J et al. U2Fusion: a unified unsupervised image fusion network[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 44, 502-518(2022).

    [57] Gao M Y, Wang J F, Chen Y et al. An improved multi-exposure image fusion method for intelligent transportation system[J]. Electronics, 10, 383(2021).

    [58] Xu H, Ma J Y, Zhang X P. MEF-GAN: multi-exposure image fusion via generative adversarial networks[J]. IEEE Transactions on Image Processing, 29, 7203-7216(2020).

    [59] Yang Z G, Chen Y P, Le Z L et al. GANFuse: a novel multi-exposure image fusion method based on generative adversarial networks[J]. Neural Computing and Applications, 33, 6133-6145(2021).

    [60] Le Z L, Huang J, Xu H et al. UIFGAN: an unsupervised continual-learning generative adversarial network for unified image fusion[J]. Information Fusion, 88, 305-318(2022).

    [61] Zhou H B, Hou J L, Zhang Y D et al. Unified gradient- and intensity-discriminator generative adversarial network for image fusion[J]. Information Fusion, 88, 184-201(2022).

    [62] Qi Y. Research on multi-exposure image fusion algorithm based on convolutional neural network[D], 12-15(2020).

    [63] Khan E A, Akyuz A O, Reinhard E. Ghost removal in high dynamic range images[C], 2005-2008(2007).

    [64] Jacobs K, Loscos C, Ward G. Automatic high-dynamic range image generation for dynamic scenes[J]. IEEE Computer Graphics and Applications, 28, 84-93(2008).

    [65] Zhang W, Cham W K. Gradient-directed composition of multi-exposure images[C], 530-536(2010).

    [66] Zhang W, Hu S N, Liu K. Patch-based correlation for deghosting in exposure fusion[J]. Information Sciences, 415/416, 19-27(2017).

    [67] Ma K D, Li H, Yong H W et al. Robust multi-exposure image fusion: a structural patch decomposition approach[J]. IEEE Transactions on Image Processing, 26, 2519-2532(2017).

    [68] Hu Y T, Zhen R W, Sheikh H. CNN-based deghosting in high dynamic range imaging[C], 4360-4364(2019).

    [69] Liu Y, Wang Z F. Dense SIFT for ghost-free multi-exposure fusion[J]. Journal of Visual Communication and Image Representation, 31, 208-224(2015).

    [70] Ye X R, Li Z P, Xu C. Ghost-free multi-exposure image fusion technology based on the multi-scale block LBP operator[J]. Electronics, 11, 3129(2022).

    [71] Kang S B, Uyttendaele M, Winder S et al. High dynamic range video[J]. ACM Transactions on Graphics, 22, 319-325(2003).

    [72] Hu J, Gallo O, Pulli K. Exposure stacks of live scenes with hand-held cameras[M]. Fitzgibbon A, Lazebnik S, Perona P, et al. Computer vision-ECCV 2012. Lecture notes in computer science, 7572, 499-512(2012).

    [73] Zimmer H, Bruhn A, Weickert J. Freehand HDR imaging of moving scenes with simultaneous resolution enhancement[J]. Computer Graphics Forum, 30, 405-414(2011).

    [74] Sie W R, Hsu C T. Alignment-free exposure fusion of image pairs[C], 1802-1806(2014).

    [75] Ulucan O, Ulucan D, Turkan M. Ghosting-free multi-exposure image fusion for static and dynamic scenes[J]. Signal Processing, 202, 108774(2023).

    [76] Xue X, Yue Z. Multi-view multi-exposure image fusion based on random walks model[M]. Lai S H, Lepetit V, Nishino K, et al. Computer vision-ACCV 2016 workshops. Lecture notes in computer science, 10118, 491-499(2017).

    [77] Trinidad M C, Martin-Brualla R, Kainz F et al. Multi-view image fusion[C], 4100-4109(2019).

    [78] Khan R, Yang Y, Liu Q et al. A ghostfree contrast enhancement method for multiview images without depth information[J]. Journal of Visual Communication and Image Representation, 78, 103175(2021).

    [79] Peng F Y, Zhang M J, Lai S M et al. Deep HDR reconstruction of dynamic scenes[C], 347-351(2018).

    [80] Ilg E, Mayer N, Saikia T et al. FlowNet 2.0: evolution of optical flow estimation with deep networks[C], 1647-1655(2017).

    [81] Prabhakar K R, Arora R, Swaminathan A et al. A fast, scalable, and reliable deghosting method for extreme exposure fusion[C](2019).

    [82] Deng Y P, Liu Q, Ikenaga T. Multi-scale contextual attention based HDR reconstruction of dynamic scenes[J]. Proceedings of SPIE, 11519, 115191F(2020).

    [83] Yan Q S, Gong D, Shi Q F et al. Attention-guided network for ghost-free high dynamic range imaging[C], 1751-1760(2020).

    [84] Yan Q S, Gong D, Shi J Q et al. Dual-attention-guided network for ghost-free high dynamic range imaging[J]. International Journal of Computer Vision, 130, 76-94(2022).

    [85] Reinhard E. High dynamic range imaging: acquisition, display, and image-based lighting[M](2010).

    [91] Mertens T, Kautz J, van Reeth F. Exposure fusion: a simple and practical alternative to high dynamic range photography[J]. Computer Graphics Forum, 28, 161-171(2009).

    [92] Gallo O, Gelfand N, Chen W C et al. Artifact-free high dynamic range imaging[C](2010).

    [93] Hu J, Gallo O, Pulli K et al. HDR deghosting: how to deal with saturation?[C], 1163-1170(2013).

    [94] Ma K D, Zeng K, Wang Z. Perceptual quality assessment for multi-exposure image fusion[J]. IEEE Transactions on Image Processing, 24, 3345-3356(2015).

    [95] Haghighat M B A, Aghagolzadeh A, Seyedarabi H. A non-reference image fusion metric based on mutual information of image features[J]. Computers & Electrical Engineering, 37, 744-756(2011).

    [96] Naidu V. Discrete cosine transform-based image fusion[J]. Defence Science Journal, 60, 48-54(2010).

    [97] Xydeas C S, Petrović V. Objective image fusion performance measure[J]. Electronics Letters, 36, 308(2000).

    [98] Hayat N, Imran M. Ghost-free multi exposure image fusion technique using dense SIFT descriptor and guided filter[J]. Journal of Visual Communication and Image Representation, 62, 295-308(2019).

    Paper Information

    Category: Reviews

    Received: Mar. 17, 2023

    Accepted: Apr. 3, 2023

    Published Online: Nov. 3, 2023

    Corresponding author email: Yuqiang Fang (fangyuqiang@nudt.edu.cn)

    DOI: 10.3788/LOP230683
