Spacecraft Recovery & Remote Sensing, Volume. 45, Issue 3, 51(2024)
Application Analysis of Object-Level (OL) Spatio-Temporal Fusion Model in NDVI and LST —— Taking Dali Area as an Example
Over the past decades, spatio-temporal fusion technology has provided an economical and efficient way to obtain long time-series observations, but existing methods retain structural information poorly and have low computational efficiency. This study compares and analyzes the differences in fusion accuracy and efficiency between mainstream pixel-level spatio-temporal fusion methods (STARFM, ESTARFM, Fit-FC, and FSDAF) and object-level (OL) spatio-temporal fusion methods for the normalized difference vegetation index (NDVI) and land surface temperature (LST). Taking the Dali area as the study area, nine spatio-temporal fusion methods are used to fuse Landsat and MODIS data, and the differences in spatio-temporal simulation quality and computational efficiency are evaluated through visual interpretation and statistical analysis. The experiments show that: 1) compared with the other spatio-temporal fusion methods, OL-FSDAF2.0 can better restore the real surface information and structural information; 2) the computational efficiency of the object-level spatio-temporal fusion methods is 20.7081 times higher than that of the pixel-level methods; 3) the object-level spatio-temporal fusion methods capture the details of the temporal dynamics of ground objects better than the pixel-level methods. Overall, the object-level spatio-temporal fusion methods achieve higher computational efficiency and more accurate fusion results, and OL-FSDAF2.0 in particular performs well in areas with complex surface cover and in simulating dynamic changes of land cover.
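The abstract states that fusion quality is assessed by visual interpretation and statistical analysis but does not list the specific metrics here. Root-mean-square error (RMSE), average difference (AD), and the correlation coefficient (CC) are commonly used to compare a fused image against a reference fine-resolution (e.g. Landsat) observation. The following is a minimal, hypothetical sketch of such an evaluation in Python; the function name and the synthetic test data are assumptions for illustration, not the authors' code.

```python
import numpy as np

def fusion_accuracy(fused, reference):
    """Compare a fused image against a reference fine-resolution image.

    Both inputs are 2-D arrays on the same grid (e.g. NDVI or LST);
    NaN pixels (clouds, fill values) are ignored.
    """
    valid = np.isfinite(fused) & np.isfinite(reference)
    f, r = fused[valid], reference[valid]
    rmse = np.sqrt(np.mean((f - r) ** 2))   # overall error magnitude
    ad = np.mean(f - r)                     # average difference (bias)
    cc = np.corrcoef(f, r)[0, 1]            # linear correlation
    return {"RMSE": rmse, "AD": ad, "CC": cc}

if __name__ == "__main__":
    # Synthetic stand-in for a reference NDVI field and a simulated fusion result.
    rng = np.random.default_rng(0)
    reference = rng.uniform(0.1, 0.9, size=(200, 200))
    fused = reference + rng.normal(0.0, 0.02, size=(200, 200))
    print(fusion_accuracy(fused, reference))
```

In practice, such metrics would be computed band by band (or index by index) between each fused prediction and the corresponding observed Landsat image on the validation date.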
Yongle GAO, Jinsheng CHANG, Yongchong YANG, Tao WANG. Application Analysis of Object-Level (OL) Spatio-Temporal Fusion Model in NDVI and LST —— Taking Dali Area as an Example[J]. Spacecraft Recovery & Remote Sensing, 2024, 45(3): 51
Received: Jul. 27, 2023
Accepted: --
Published Online: Oct. 30, 2024
The Author Email: YANG Yongchong (363195405@qq.com)