Article: An enhanced unmixing model for spatiotemporal image fusion

Title: An enhanced unmixing model for spatiotemporal image fusion
Authors: Huang, Bo; Jiang, Xiaolu
Keywords: Change detection; Remote sensing images; Spatial and temporal fusion; Spatial resolution; Spatial unmixing of pixels; Temporal resolution
Issue Date: 2021
Citation: National Remote Sensing Bulletin, 2021, v. 25, n. 1, p. 241-250
Abstract: Remote sensing images with high spatial and temporal resolutions are vital for real-time, fine-scale monitoring of the land surface and the atmospheric environment. However, a single satellite sensor has to trade off between spatial and temporal resolution due to technical and budget limitations. In recent years, numerous spatial and temporal image fusion models have been proposed to produce high-resolution images at low cost and with remarkable effectiveness. Despite varying levels of success in the accuracy of fused images and the efficiency of the algorithms, challenges remain in recovering spatial details under complex land cover changes. This study presents an enhanced unmixing model for spatial and temporal image fusion (EUSTFM) that simultaneously accounts for phenological changes (e.g., vegetation growth), shape land cover changes (e.g., urban expansion), and non-shape land cover changes (e.g., crop rotation) on the land surface. First, a change detection method is devised to identify pixels with land cover change. Similar pixels of the detected pixels are then searched in the neighborhood to recompose the spectral reflectance on the prediction date. Thus, the real land cover class on the prediction date can be defined from the recomposed high-resolution image rather than directly from the classification result of a prior date. Subsequently, spatial unmixing of pixels is conducted on the prior and prediction dates to produce a medium-resolution image pair with accurate spatial details. Finally, a similar-pixel calculation in the neighborhood is applied for the final prediction of the fused images, using the original high- and low-resolution image pair at the prior time, the low-resolution image at the prediction time, and the produced medium-resolution image pair at the prior and prediction times. The algorithm was tested on two actual Landsat-MODIS datasets, one featuring typical phenological changes in a complex landscape in Australia and the other featuring shape land cover changes in Shenzhen, China, to demonstrate the performance of the proposed EUSTFM under complex temporal changes on various landscapes. Comparisons with popular spatiotemporal fusion models, including the Spatial and Temporal Adaptive Reflectance Fusion Model (STARFM) and Flexible Spatiotemporal DAta Fusion (FSDAF), showed that EUSTFM robustly achieves better fusion accuracy for phenological, non-shape, and shape land cover changes. For the typical phenological changes in the complex Australian landscape, the fused results from STARFM and FSDAF showed significant differences between the green band and the other two bands, whereas the fused images from EUSTFM showed consistently high accuracy in all three bands. This finding indicates better performance when fusing images with various spatial resolution gaps, including a factor of 8 in the near-infrared and red bands and a factor of 16 in the green band. The proposed EUSTFM shows great potential for facilitating the monitoring of complex and diverse land surface dynamics.
Persistent Identifier: http://hdl.handle.net/10722/329697
ISSN: 1007-4619
2023 SCImago Journal Rankings: 0.521
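
A minimal sketch of the spatial-unmixing step described in the abstract: given a fine-resolution classification map and a window of coarse pixels, per-class reflectance can be estimated by least squares. This Python/NumPy sketch is illustrative only; the window size, scale factor, and function name are assumptions, and the paper's change detection, similar-pixel weighting, and constraints on the solution are omitted.

import numpy as np

# Illustrative sketch (not the paper's implementation): each coarse pixel is
# modelled as the class-fraction-weighted sum of unknown class reflectances,
# and the per-class reflectances are solved by ordinary least squares over a
# window of coarse pixels. Class labels are assumed to be 0..n_classes-1, and
# each coarse pixel is assumed to cover scale x scale fine pixels.
def unmix_window(coarse_window, class_map_window, n_classes, scale):
    rows, cols = coarse_window.shape
    fractions = np.zeros((rows * cols, n_classes))
    for i in range(rows):
        for j in range(cols):
            # Class fractions inside this coarse pixel, from the fine-resolution map.
            block = class_map_window[i * scale:(i + 1) * scale, j * scale:(j + 1) * scale]
            counts = np.bincount(block.ravel(), minlength=n_classes)
            fractions[i * cols + j] = counts / counts.sum()
    # Solve fractions @ x = coarse values for the class reflectances x.
    class_reflectance, *_ = np.linalg.lstsq(fractions, coarse_window.ravel(), rcond=None)
    return class_reflectance

# Hypothetical example: a 5 x 5 window of coarse pixels, 4 classes, scale factor 16.
coarse = np.random.rand(5, 5)
classes = np.random.randint(0, 4, size=(5 * 16, 5 * 16))
print(unmix_window(coarse, classes, n_classes=4, scale=16))

Applying such an unmixing on both the prior and prediction dates, with the recomposed classification, would yield the medium-resolution image pair that feeds the final similar-pixel prediction described in the abstract.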


DC Field: Value
dc.contributor.author: Huang, Bo
dc.contributor.author: Jiang, Xiaolu
dc.date.accessioned: 2023-08-09T03:34:40Z
dc.date.available: 2023-08-09T03:34:40Z
dc.date.issued: 2021
dc.identifier.citation: National Remote Sensing Bulletin, 2021, v. 25, n. 1, p. 241-250
dc.identifier.issn: 1007-4619
dc.identifier.uri: http://hdl.handle.net/10722/329697
dc.language: eng
dc.relation.ispartof: National Remote Sensing Bulletin
dc.subject: Change detection
dc.subject: Remote sensing images
dc.subject: Spatial and temporal fusion
dc.subject: Spatial resolution
dc.subject: Spatial unmixing of pixels
dc.subject: Temporal resolution
dc.title: An enhanced unmixing model for spatiotemporal image fusion
dc.type: Article
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.11834/jrs.20210459
dc.identifier.scopus: eid_2-s2.0-85103168767
dc.identifier.volume: 25
dc.identifier.issue: 1
dc.identifier.spage: 241
dc.identifier.epage: 250
