Laser & Optoelectronics Progress, Vol. 60, Issue 22, 2200003 (2023)

Review of Multi-Exposure Image Fusion Methods

Xinli Zhu1, Yasheng Zhang2, Yuqiang Fang2,*, Xitao Zhang2, Jieping Xu2, and Di Luo1
Author Affiliations
  • 1Department of Graduate Management, Space Engineering University, Beijing 101416, China
  • 2Space Engineering University, Beijing 101416, China
    Figures & Tables (11)
    • Pyramid framework
    • Gradient direction of exposure sequence
    • Fusion method based on sparse representation
    • Flowchart of multi-exposure fusion method based on CNN features
    • Framework of GANFuse method
    • Table 1. Summary of multi-exposure image fusion methods for static scenes

      Type | Category | Name | Year/Source | Contribution
      Spatial domain method | Pixel-based method | Raman et al. [6] | 2009/EUROGRAPHICS | Compositing based on the bilateral filter
      Spatial domain method | Pixel-based method | Li et al. [7] | 2012/IEEE | Fusion with a median filter and a recursive filter
      Spatial domain method | Pixel-based method | Lee et al. [8] | 2018/IEEE | Adaptive weighting that reflects relative pixel intensity and global gradients
      Spatial domain method | Pixel-based method | Xu et al. [9] | 2020/Optik | Using patterns of oriented edge magnitudes to extract local contrast
      Spatial domain method | Pixel-based method | Ulucan et al. [10] | 2021/Signal Processing | Using linear embeddings and watershed masking for fusion
      Spatial domain method | Pixel-based method | Li et al. [11] | 2021/China Sciencepaper | Adaptive weights constructed from pixel intensity and global gradients
      Spatial domain method | Pixel-based method | Kinoshita et al. [12] | 2019/IEEE | Segmentation based on the luminance distribution
      Spatial domain method | Pixel-based method | Wang et al. [13] | 2021/Visual Computer | Determining a region of enhancement (RoE) for each image
      Spatial domain method | Patch-based method | Goshtasby et al. [14] | 2005/Image and Vision Computing | First multi-exposure fusion using image blocks
      Spatial domain method | Patch-based method | Ma et al. [15] | 2015/IEEE | Decomposing image patches into signal strength, signal structure, and mean intensity
      Spatial domain method | Patch-based method | Huang et al. [16] | 2018/IEEE | Three weight measurements build signal structure and mean intensity
      Spatial domain method | Patch-based method | Li et al. [17] | 2020/Journal of Computer Applications | Local variance, local saliency, and local visibility build the weight map
      Spatial domain method | Patch-based method | Li et al. [18] | 2020/IEEE | Non-normalized operations
      Spatial domain method | Patch-based method | Li et al. [19] | 2021/IEEE | Incorporating edge-preserving factors into mean intensity
      Spatial domain method | Patch-based method | Wang et al. [20] | 2020/IEEE | Using a super-pixel segmentation approach
      Spatial domain method | Optimization-based method | Shen et al. [21] | 2011/IEEE | A generalized random walk framework
      Spatial domain method | Optimization-based method | Li et al. [22] | 2012/IEEE | Using a quadratic optimization method
      Spatial domain method | Optimization-based method | Liu et al. [24] | 2019/IEEE | Using an optimal weighted multi-exposure fusion mechanism
      Spatial domain method | Optimization-based method | Ma et al. [25] | 2018/IEEE | Describing a gradient ascent-based algorithm
      Transform domain method | Multi-scale decomposition-based method | Burt et al. [26] | 1993/IEEE | First application of a gradient pyramid model to multi-exposure fusion
      Transform domain method | Multi-scale decomposition-based method | Mertens et al. [27] | 2007/IEEE | Weighted Gaussian pyramid applied to a Laplacian pyramid
      Transform domain method | Multi-scale decomposition-based method | Shen et al. [28] | 2014/IEEE | Building weight maps from local weights, global weights, and saliency weights
      Transform domain method | Multi-scale decomposition-based method | Li et al. [29] | 2013/IEEE | Weighted guided filtering on Gaussian pyramids
      Transform domain method | Multi-scale decomposition-based method | Yan et al. [30] | 2019/Pattern Recognition Letters | A simulated exposure model for generating multiple images
      Transform domain method | Multi-scale decomposition-based method | Li et al. [31] | 2017/IEEE | Using guided filtering to split images into base and detail layers
      Transform domain method | Multi-scale decomposition-based method | Singh et al. [32] | 2014/Scientific World Journal | Adding details to the Laplacian pyramid
      Transform domain method | Multi-scale decomposition-based method | Wang et al. [33] | 2020/IEEE | Laplacian pyramid in the YUV color space
      Transform domain method | Multi-scale decomposition-based method | Kou et al. [34] | 2018/Journal of Visual Communication and Image Representation | Edge-preserving smoothing pyramid
      Transform domain method | Multi-scale decomposition-based method | Yang et al. [35] | 2018/IEEE | Generating a moderately exposed analog mapping image
      Transform domain method | Multi-scale decomposition-based method | Tang et al. [36] | 2022/Laser & Optoelectronics Progress | A high dynamic range imaging method based on YCbCr spatial fusion
      Transform domain method | Multi-scale decomposition-based method | Liu et al. [37] | 2022/Laser & Optoelectronics Progress | A multi-exposure image fusion method based on full-sequence image feature weights
      Transform domain method | Multi-scale decomposition-based method | Wu et al. [38] | 2021/Laser & Optoelectronics Progress | A multi-exposure image fusion method based on improved exposure evaluation and a double pyramid
      Transform domain method | Gradient-based method | Zhang et al. [39] | 2012/IEEE | A two-dimensional Gaussian filter to compute gradient magnitude and direction
      Transform domain method | Gradient-based method | Paul et al. [40] | 2016/Journal of Circuits, Systems and Computers | Adding the chroma gradient to the luminance gradient
      Transform domain method | Gradient-based method | Liu et al. [41] | 2020/IET Image Processing | Computing luminance levels in the gradient domain
      Transform domain method | Gradient-based method | Gu et al. [42] | 2012/Journal of Visual Communication and Image Representation | Iteratively modifying the gradient field with twice average filtering and multi-scale nonlinear compression
      Transform domain method | Sparse representation-based method | Wang et al. [43] | 2014/Neurocomputing | A recognition framework based on discriminative sparse representation
      Transform domain method | Sparse representation-based method | Shao et al. [44] | 2018/Applied Sciences | A halo-free multi-exposure fusion method based on sparse representation of gradient features
      Transform domain method | Sparse representation-based method | Yang et al. [45] | 2020/IEEE | An exposure fusion method with sparse decomposition and a sparsity exposure dictionary
      Deep learning method | CNN | Kalantari et al. [46] | 2017/ACM Transactions on Graphics | The first learning-based technique to produce an HDR image
      Deep learning method | CNN | Li et al. [47] | 2018/IEEE | Using a CNN to extract the features of each image
      Deep learning method | CNN | Liu et al. [48] | 2019/Information Fusion | Proposing FusionNet
      Deep learning method | CNN | Chen et al. [49] | 2019/Journal of Visual Communication and Image Representation | Constructing a dual-network cascade model
      Deep learning method | CNN | Cai et al. [50] | 2018/IEEE | A large-scale multi-exposure image dataset
      Deep learning method | CNN | Prabhakar et al. [51] | 2017/IEEE | The first unsupervised deep learning method (DeepFuse)
      Deep learning method | CNN | Qi et al. [52] | 2021/Information Fusion | An unsupervised deep learning approach based on quantitative evaluation
      Deep learning method | CNN | Han et al. [53] | 2022/Information Fusion | A depth-aware enhancement network
      Deep learning method | CNN | Zhang et al. [54] | 2020/Information Fusion | A general image fusion framework based on convolutional neural networks
      Deep learning method | CNN | Ma et al. [55] | 2019/IEEE | A multi-exposure fusion network based on deep guided learning
      Deep learning method | CNN | Xu et al. [56] | 2022/IEEE | An unsupervised end-to-end image fusion network (U2Fusion)
      Deep learning method | CNN | Gao et al. [57] | 2021/Electronics | Application to traffic sign identification
      Deep learning method | GAN | Xu et al. [58] | 2020/IEEE | The first GAN-based multi-exposure image fusion network (MEF-GAN)
      Deep learning method | GAN | Yang et al. [59] | 2021/Neural Computing and Applications | A novel GAN-based multi-exposure image fusion method (GANFuse)
      Deep learning method | GAN | Le et al. [60] | 2022/Information Fusion | A generative adversarial network based on continual learning (UIFGAN)
      Deep learning method | GAN | Zhou et al. [61] | 2022/Information Fusion | An image fusion method based on GAN (GIDGAN)
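Many of the spatial domain entries above share one core idea: score every pixel of every exposure, normalize the scores across the stack, and blend. As a minimal illustrative sketch (not any specific method from the table; it uses only the well-exposedness cue from Mertens-style exposure fusion [27] and omits the pyramid blending), for grayscale images in [0, 1]:

```python
import numpy as np

def exposure_fuse(images, sigma=0.2):
    """Single-scale weighted-average exposure fusion (sketch).

    Each pixel is scored by how close it is to mid-gray (a Gaussian
    "well-exposedness" weight), the scores are normalized across the
    stack, and the exposures are averaged with those weights.
    """
    stack = np.stack([np.asarray(img, dtype=np.float64) for img in images])  # (N, H, W)
    weights = np.exp(-0.5 * ((stack - 0.5) / sigma) ** 2)  # favor mid-gray pixels
    weights /= weights.sum(axis=0) + 1e-12                 # normalize per pixel
    return (weights * stack).sum(axis=0)                   # per-pixel weighted average

# Toy stack: under-, mid-, and over-exposed versions of a gray ramp.
ramp = np.linspace(0.0, 1.0, 64).reshape(8, 8)
under, mid, over = np.clip(ramp * 0.3, 0, 1), ramp, np.clip(ramp * 1.7, 0, 1)
fused = exposure_fuse([under, mid, over])
print(fused.shape, float(fused.min()) >= 0.0, float(fused.max()) <= 1.0)  # → (8, 8) True True
```

Because the weights are a convex combination at each pixel, the fused value always lies between the darkest and brightest exposure at that pixel; the multi-scale methods in the table exist precisely because this naive single-scale blend produces seams that pyramid blending removes.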
    • Table 2. Summary of multi-exposure image fusion methods for dynamic scenes

      Category | Name | Year/Source | Contribution
      Based on motion detection | Khan et al. [63] | 2006/ICIP | Computing weights iteratively to remove ghosts
      Based on motion detection | Jacobs et al. [64] | 2008/IEEE | Computing local information entropy to detect moving pixels
      Based on motion detection | Pece et al. [5] | 2010/IEEE | Using the median threshold to design a bitmap motion detection algorithm
      Based on motion detection | Zhang et al. [65] | 2010/IEEE | Using gradient direction consistency as the basis for motion detection
      Based on motion detection | Zhang et al. [66] | 2017/Information Sciences | Detecting motion by comparing the structural consistency of images
      Based on motion detection | Ma et al. [67] | 2017/IEEE | Computing the structural consistency of the direction of the signal-structure components
      Based on motion detection | Hu et al. [68] | 2019/IEEE | A convolutional neural network based deghosting algorithm
      Based on motion detection | Liu et al. [69] | 2015/Journal of Visual Communication and Image Representation | A method based on the dense scale-invariant feature transform
      Based on motion detection | Ye et al. [70] | 2022/Electronics | A ghost-free multi-exposure image fusion technique
      Based on image registration | Kang et al. [71] | 2003/ACM Transactions on Graphics | Gradient-based optical flow estimation of the motion field
      Based on image registration | Hu et al. [72] | 2012/ECCV | Aligning images with local consistency in geometry and luminosity
      Based on image registration | Zimmer et al. [73] | 2011/Computer Graphics Forum | A modern energy-based optical flow approach
      Based on image registration | Sie et al. [74] | 2014/IEEE | Determining the fusion map via a Markov random field
      Based on image registration | Ulucan et al. [75] | 2023/Signal Processing | Exposure intensity mapping using histogram matching operators
      Based on image registration | Xue et al. [76] | 2016/Asian Conference on Computer Vision | Introducing a random walk model
      Based on image registration | Trinidad et al. [77] | 2019/IEEE | A cascade feature extraction method based on optical flow
      Based on image registration | Khan et al. [78] | 2021/Journal of Visual Communication and Image Representation | Relying on accurate detection and matching of feature points across adjacent viewpoints
      Based on deep learning | Peng et al. [79] | 2018/IEEE | Deep-network optical flow registration for alignment
      Based on deep learning | Prabhakar et al. [81] | 2019/IEEE | A CNN-based fast, scalable image deghosting method
      Based on deep learning | Deng et al. [82] | 2020/SPIE | A multi-scale contextual attention guided alignment network (CAHDRNet)
      Based on deep learning | Yan et al. [83] | 2020/IEEE | An attention-guided end-to-end deep neural network (AHDRNet)
      Based on deep learning | Yan et al. [84] | 2022/International Journal of Computer Vision | A dual-attention-guided end-to-end deep neural network (DAHDRNet)
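The median-threshold bitmap detection attributed to Pece et al. [5] in Table 2 can be sketched as follows. This is an illustrative simplification under stated assumptions (grayscale float images, no noise tolerance band around the median, no morphological cleanup of the mask), not the published algorithm:

```python
import numpy as np

def motion_mask(images):
    """Flag candidate moving pixels via median-threshold bitmaps.

    Thresholding each exposure at its own median yields bitmaps that
    are invariant to monotone exposure changes, so for a static scene
    every exposure should produce the same bitmap. Any pixel whose
    bitmap value varies across the stack is marked as motion.
    """
    bitmaps = np.stack([img > np.median(img) for img in images])  # (N, H, W) booleans
    static = bitmaps.all(axis=0) | (~bitmaps).all(axis=0)         # same bit in every exposure
    return ~static

# Toy stack: a bright square that moves between two exposures.
a = np.zeros((8, 8)); a[2:4, 2:4] = 1.0
b = np.zeros((8, 8)); b[5:7, 5:7] = 0.5   # darker exposure, object moved
mask = motion_mask([a, b])
print(mask[2, 2], mask[5, 5], mask[0, 0])  # → True True False
```

The mask fires at both the old and the new object position, which is what a deghosting pipeline needs: those pixels are then excluded from fusion or filled from a single reference exposure.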
    • Table 3. Summary of static multi-exposure image sequences

      Image set | Size | Number | Image origin
      Arno | 339×521 | 3 | Kede Ma [25]
      Balloons | 339×521 | 9 | Erik Reinhard [85]
      Belgium house | 384×521 | 9 | Dani Lischinski [87]
      Cafe | 247×371 | 3 | Fei Kou [34]
      Candle | 364×512 | 6 | Kede Ma [86]
      Cave | 384×512 | 4 | Kede Ma [86]
      Chinese garden | 340×512 | 3 | Kede Ma [86]
      Church | 335×512 | 3 | Jianbing Shen [28]
      Farmhouse | 341×512 | 3 | HDR Projects [90]
      House | 500×752 | 4 | Mertens [91]
      Kluki | 341×512 | 3 | Kede Ma [86]
      Laurenziana | 512×356 | 3 | Kede Ma [25]
      Lighthouse | 340×512 | 3 | HDRsoft [86]
      Mask | 800×1200 | 3 | HDRsoft [86]
      Office | 340×512 | 6 | Matlab [88]
      Preschool | 247×371 | 3 | Fei Kou [34]
      Room | 467×700 | 3 | Pangeasoft [89]
      Set | 341×512 | 3 | Kede Ma [25]
      Sports centre | 247×371 | 3 | Fei Kou [34]
      Tower | 512×341 | 3 | Kede Ma [7]
      Tree | 408×271 | 3 | Fei Kou [34]
      Venice | 341×512 | 3 | HDRsoft [86]
      Window | 384×512 | 3 | Kede Ma [25]
      Yellow hall | 339×512 | 3 | Kede Ma [25]
    • Table 4. Summary of dynamic multi-exposure image sequences

      Image set | Size | Number | Image origin
      Arch | 1024×669 | 5 | Orazio Gallo [92]
      Brunswick | 683×1024 | 3 | Fabrizio Pece [5]
      Campus | 648×1011 | 6 | Wei Zhang [36]
      Cliffs1 | 683×1024 | 3 | Fabrizio Pece [5]
      Forrest | 683×1024 | 4 | Orazio Gallo [92]
      Horse | 690×1024 | 3 | Kang [71]
      Lady | 1024×686 | 3 | Jun Hu [93]
      Noise camera | 480×640 | 10 | Orazio Gallo [92]
      Office | 768×1024 | 3 | Kede Ma [58]
      Prof.JeonEigth | 681×1024 | 7 | Zhengguo Li [17]
      Puppets | 1024×812 | 5 | Orazio Gallo [92]
      Russ1 | 683×1024 | 3 | Fabrizio Pece [5]
      Sculpture garden | 754×1024 | 5 | Orazio Gallo [92]
      Square | 683×1024 | 3 | Fabrizio Pece [5]
      Tate3 | 683×1024 | 3 | Fabrizio Pece [5]
      Wroclav | 683×1024 | 3 | Fabrizio Pece [5]
      YWFusionopolis | 681×1024 | 6 | Zhengguo Li [17]
    • Table 5. Summary of performance evaluation indicators of multi-exposure image fusion method for static scenes

      Method | MEF-SSIM | MI | PSNR | SD | He | QAB/F | NIQE
      Raman et al. [6]
      Li et al. [7]
      Lee et al. [8]
      Xu et al. [9]
      Ulucan et al. [10]
      Li et al. [11]
      Kinoshita et al. [12]
      Wang et al. [13]
      Goshtasby et al. [14]
      Ma et al. [15]
      Huang et al. [16]
      Li et al. [17]
      Li et al. [18]
      Li et al. [19]
      Wang et al. [20]
      Shen et al. [21]
      Li et al. [22]
      Liu et al. [24]
      Ma et al. [25]
      Burt et al. [26]
      Mertens et al. [27]
      Shen et al. [28]
      Li et al. [29]
      Yan et al. [30]
      Li et al. [31]
      Singh et al. [32]
      Wang et al. [33]
      Kou et al. [34]
      Yang et al. [35]
      Zhang et al. [39]
      Paul et al. [40]
      Liu et al. [41]
      Gu et al. [42]
      Shao et al. [44]
      Wang et al. [43]
      Yang et al. [45]
      Kalantari et al. [46]
      Li et al. [47]
      Liu et al. [48]
      Chen et al. [49]
      Cai et al. [50]
      Prabhakar et al. [51]
      Qi et al. [52]
      Xu et al. [56]
      Gao et al. [57]
      Yang et al. [59]
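Of the indicators in the table header, MEF-SSIM, MI, QAB/F, and NIQE require reference implementations, but SD (standard deviation), He (information entropy), and PSNR are simple enough to sketch. The definitions below are generic textbook forms, assuming grayscale images with values in [0, 1], and are not claimed to match the exact formulations used by the cited papers:

```python
import numpy as np

def sd(img):
    """Standard deviation: higher values generally indicate more contrast."""
    return float(np.asarray(img, dtype=np.float64).std())

def entropy(img, bins=256):
    """Information entropy (He) of the gray-level histogram, in bits."""
    hist, _ = np.histogram(img, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]                                  # drop empty bins (0 log 0 := 0)
    return float(-(p * np.log2(p)).sum() + 0.0)   # + 0.0 normalizes -0.0 to 0.0

def psnr(img, ref):
    """Peak signal-to-noise ratio against a reference, in dB (peak = 1.0)."""
    mse = float(((np.asarray(img, float) - np.asarray(ref, float)) ** 2).mean())
    return float("inf") if mse == 0 else 10.0 * np.log10(1.0 / mse)

# Sanity checks on a flat mid-gray image and a uniformly shifted copy.
flat = np.full((16, 16), 0.5)
noisy = flat + 0.1
print(sd(flat), entropy(flat), round(psnr(flat, noisy), 1))
```

SD and He are no-reference statistics of the fused image alone, while PSNR needs a reference; that distinction is why benchmarks such as those summarized here mix the two kinds of indicator.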
    • Table 6. Summary of performance evaluation indicators of multi-exposure image fusion method for dynamic scenes

      Method | MEF-SSIM | MI | PSNR | SD | He | QAB/F | NIQE
      Khan et al. [63]
      Jacobs et al. [64]
      Pece et al. [5]
      Zhang et al. [65]
      Zhang et al. [66]
      Ma et al. [67]
      Hu et al. [68]
      Liu et al. [69]
      Ye et al. [70]
      Kang et al. [71]
      Hu et al. [72]
      Zimmer et al. [73]
      Sie et al. [74]
      Ulucan et al. [75]
      Xue et al. [76]
      Trinidad et al. [77]
      Khan et al. [78]
      Peng et al. [79]
      Prabhakar et al. [81]
      Deng et al. [82]
      Yan et al. [83]
      Yan et al. [84]

    Xinli Zhu, Yasheng Zhang, Yuqiang Fang, Xitao Zhang, Jieping Xu, Di Luo. Review of Multi-Exposure Image Fusion Methods[J]. Laser & Optoelectronics Progress, 2023, 60(22): 2200003

    Paper Information

    Category: Reviews

    Received: Mar. 17, 2023

    Accepted: Apr. 3, 2023

    Published Online: Nov. 3, 2023

    The Author Email: Yuqiang Fang (fangyuqiang@nudt.edu.cn)

    DOI: 10.3788/LOP230683
