Laser & Optoelectronics Progress, Volume 61, Issue 8, 0828001 (2024)

Classification Method of Remote Sensing Image Based on Dynamic Weight Transform and Dual Network Self Verification

Qingfang Zhang1, Ming Cong1,*, Ling Han1, Jiangbo Xi1, Qingqing Jing2, Jianjun Cui1, Chengsheng Yang1, Chaofeng Ren1, Junkai Gu1, Miaozhong Xu3, and Yiting Tao3
Author Affiliations
  • 1College of Geology Engineering and Geomatics, Chang'an University, Xi'an 710054, Shaanxi, China
  • 2China Aero Geophysical Survey & Remote Sensing Center for Land and Resources, Beijing 100083, China
  • 3State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University, Wuhan 430079, Hubei, China
    Figures & Tables (15)
    • The basic structure of dynamic weight deformation
    • The application of point multiplication of dynamic weight deformation
    • Diamond feature extraction structure based on residual connection
    • Classification network structure with dynamic deformation of multi-diamond weights
    • Schematic diagram of the YOLO network improved by dynamic weight deformation
    • Schematic diagram of the dual-model network structure
    • Experiment 1. (a) Original remote sensing image; (b) feature category composition; (c) GT; (d) results of FCN; (e) results of Attention-Unet; (f) results of MASK-RCNN; (g) results of DWTCN; (h) results of DWTDN
    • Experiment 2. (a) Original remote sensing image; (b) feature category composition; (c) GT; (d) results of FCN; (e) results of Attention-Unet; (f) results of MASK-RCNN; (g) results of DWTCN; (h) results of DWTDN
    • Experiment 3. (a) Original remote sensing image; (b) feature category composition; (c) GT; (d) results of FCN; (e) results of Attention-Unet; (f) results of MASK-RCNN; (g) results of DWTCN; (h) results of DWTDN
    • Experiment 4. (a) Original remote sensing image; (b) feature category composition; (c) GT; (d) results of FCN; (e) results of Attention-Unet; (f) results of MASK-RCNN; (g) results of DWTCN; (h) results of DWTDN
    • Table 1. Classification accuracy of experiment 1

      | Class      | FCN (OA / Kappa) | Attention-Unet (OA / Kappa) | MASK-RCNN (OA / Kappa) | DWTCN (OA / Kappa) | DWTDN (OA / Kappa) |
      |------------|------------------|-----------------------------|------------------------|--------------------|--------------------|
      | villa      | 0.84 / 0.27      | 0.88 / 0.36                 | 0.91 / 0.43            | 0.84 / 0.28        | 0.90 / 0.42        |
      | building   | 0.74 / 0.18      | 0.86 / 0.35                 | 0.83 / 0.32            | 0.89 / 0.25        | 0.87 / 0.35        |
      | vegetation | 0.84 / 0.70      | 0.94 / 0.69                 | 0.94 / 0.71            | 0.83 / 0.67        | 0.95 / 0.75        |
      | road       | 0.76 / 0.16      | 0.57 / 0.22                 | 0.57 / 0.25            | 0.88 / 0.18        | 0.67 / 0.32        |
      | water      | 0.88 / 0.97      | 0.98 / 0.96                 | 0.98 / 0.97            | 0.98 / 0.96        | 0.99 / 0.98        |
      | bare land  | 0.77 / 0.35      | 0.67 / 0.40                 | 0.66 / 0.40            | 0.77 / 0.36        | 0.67 / 0.41        |
      | overall    | 0.72 / 0.60      | 0.70 / 0.57                 | 0.79 / 0.69            | 0.73 / 0.61        | 0.82 / 0.72        |
    • Table 2. Classification accuracy of experiment 2

      | Class       | FCN (OA / Kappa) | Attention-Unet (OA / Kappa) | MASK-RCNN (OA / Kappa) | DWTCN (OA / Kappa) | DWTDN (OA / Kappa) |
      |-------------|------------------|-----------------------------|------------------------|--------------------|--------------------|
      | building    | 0.67 / 0.20      | 0.54 / 0.15                 | 0.62 / 0.28            | 0.64 / 0.10        | 0.70 / 0.31        |
      | vegetation  | 0.82 / 0.27      | 0.86 / 0.41                 | 0.88 / 0.45            | 0.89 / 0.41        | 0.90 / 0.48        |
      | greenhouses | 0.73 / 0.45      | 0.82 / 0.65                 | 0.80 / 0.60            | 0.78 / 0.57        | 0.83 / 0.67        |
      | bare land   | 0.70 / 0.39      | 0.72 / 0.34                 | 0.71 / 0.33            | 0.74 / 0.47        | 0.72 / 0.34        |
      | overall     | 0.60 / 0.38      | 0.69 / 0.51                 | 0.66 / 0.47            | 0.66 / 0.48        | 0.71 / 0.52        |
    • Table 3. Classification accuracy of experiment 3

      | Class       | FCN (OA / Kappa) | Attention-Unet (OA / Kappa) | MASK-RCNN (OA / Kappa) | DWTCN (OA / Kappa) | DWTDN (OA / Kappa) |
      |-------------|------------------|-----------------------------|------------------------|--------------------|--------------------|
      | building    | 0.93 / 0.59      | 0.90 / 0.43                 | 0.91 / 0.54            | 0.92 / 0.61        | 0.93 / 0.60        |
      | vegetation  | 0.88 / 0.53      | 0.89 / 0.52                 | 0.89 / 0.52            | 0.88 / 0.55        | 0.89 / 0.54        |
      | water       | 0.49 / 0.14      | 0.61 / 0.24                 | 0.67 / 0.26            | 0.48 / 0.06        | 0.69 / 0.28        |
      | greenhouses | 0.85 / 0.64      | 0.89 / 0.73                 | 0.90 / 0.75            | 0.89 / 0.71        | 0.91 / 0.76        |
      | bare land   | 0.72 / 0.48      | 0.70 / 0.56                 | 0.74 / 0.57            | 0.75 / 0.52        | 0.75 / 0.57        |
      | overall     | 0.71 / 0.55      | 0.66 / 0.51                 | 0.68 / 0.53            | 0.71 / 0.56        | 0.72 / 0.56        |
    • Table 4. Classification accuracy of experiment 4

      | Class      | FCN (OA / Kappa) | Attention-Unet (OA / Kappa) | MASK-RCNN (OA / Kappa) | DWTCN (OA / Kappa) | DWTDN (OA / Kappa) |
      |------------|------------------|-----------------------------|------------------------|--------------------|--------------------|
      | building   | 0.69 / 0.43      | 0.81 / 0.63                 | 0.83 / 0.67            | 0.72 / 0.49        | 0.84 / 0.68        |
      | vegetation | 0.80 / 0.56      | 0.83 / 0.64                 | 0.84 / 0.65            | 0.81 / 0.57        | 0.85 / 0.68        |
      | road       | 0.52 / 0.05      | 0.59 / 0.25                 | 0.61 / 0.29            | 0.62 / 0.18        | 0.66 / 0.32        |
      | bare land  | 0.76 / 0.01      | 0.77 / 0.43                 | 0.77 / 0.43            | 0.76 / 0.01        | 0.77 / 0.43        |
      | overall    | 0.59 / 0.37      | 0.63 / 0.48                 | 0.65 / 0.49            | 0.62 / 0.42        | 0.70 / 0.53        |
    • Table 5. Classification accuracy of all experiments

      | Experiment | FCN (OA / Kappa) | Attention-Unet (OA / Kappa) | MASK-RCNN (OA / Kappa) | DWTCN (OA / Kappa) | DWTDN (OA / Kappa) |
      |------------|------------------|-----------------------------|------------------------|--------------------|--------------------|
      | Exp. 1     | 0.72 / 0.60      | 0.70 / 0.57                 | 0.79 / 0.69            | 0.73 / 0.61        | 0.82 / 0.72        |
      | Exp. 2     | 0.60 / 0.38      | 0.69 / 0.51                 | 0.66 / 0.47            | 0.66 / 0.48        | 0.71 / 0.52        |
      | Exp. 3     | 0.71 / 0.55      | 0.66 / 0.51                 | 0.68 / 0.53            | 0.71 / 0.56        | 0.72 / 0.56        |
      | Exp. 4     | 0.59 / 0.37      | 0.63 / 0.48                 | 0.65 / 0.49            | 0.62 / 0.42        | 0.70 / 0.53        |
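    The tables above report overall accuracy (OA) and the kappa coefficient per class and per experiment. As a reminder of how these two metrics are defined, the sketch below computes both from a confusion matrix. This is a minimal illustration, not code from the paper; the function name and the toy 2-class matrix are assumptions for demonstration only.

    ```python
    import numpy as np

    def oa_and_kappa(confusion):
        """Overall accuracy (OA) and Cohen's kappa from a confusion matrix.

        confusion[i, j] = number of samples of true class i predicted as class j.
        """
        confusion = np.asarray(confusion, dtype=float)
        total = confusion.sum()
        # OA: fraction of samples on the diagonal (observed agreement)
        oa = np.trace(confusion) / total
        # Expected agreement by chance, from the row/column marginals
        pe = (confusion.sum(axis=1) * confusion.sum(axis=0)).sum() / total**2
        # Kappa: agreement beyond chance, normalized by the maximum possible
        kappa = (oa - pe) / (1.0 - pe)
        return oa, kappa

    # Toy 2-class confusion matrix (illustrative, not data from the paper)
    cm = np.array([[40, 10],
                   [ 5, 45]])
    oa, kappa = oa_and_kappa(cm)
    print(oa, kappa)  # -> 0.85 0.7
    ```

    Kappa can be well below OA when one class dominates (chance agreement is high), which is why the tables list both.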
    Citation

    Qingfang Zhang, Ming Cong, Ling Han, Jiangbo Xi, Qingqing Jing, Jianjun Cui, Chengsheng Yang, Chaofeng Ren, Junkai Gu, Miaozhong Xu, Yiting Tao. Classification Method of Remote Sensing Image Based on Dynamic Weight Transform and Dual Network Self Verification[J]. Laser & Optoelectronics Progress, 2024, 61(8): 0828001

    Paper Information

    Category: Remote Sensing and Sensors

    Received: May 26, 2023

    Accepted: Jul. 24, 2023

    Published Online: Mar. 15, 2024

    Corresponding author: Ming Cong (mingc@chd.edu.cn)

    DOI: 10.3788/LOP231381
