Article: Unified fusion of remote-sensing imagery: Generating simultaneously high-resolution synthetic spatial–temporal–spectral earth observations

Title: Unified fusion of remote-sensing imagery: Generating simultaneously high-resolution synthetic spatial–temporal–spectral earth observations
Authors: Huang, Bo; Zhang, Hankui; Song, Huihui; Wang, Juan; Song, Chunqiao
Issue Date: 2013
Citation: Remote Sensing Letters, 2013, v. 4, n. 6, p. 561-569
Abstract: Current satellite remote-sensing systems compromise between spatial resolution and spectral and/or temporal resolution, which potentially limits the use of remotely sensed data in various applications. Image fusion processes, including spatial and spectral fusion (SSF) and spatial and temporal fusion (STF), provide powerful tools for addressing these technological limitations. Although SSF and STF have been extensively studied separately, they have not yet been integrated into a unified framework to generate synthetic satellite images with high spatial, temporal and spectral resolution. By formulating these two types of fusion into one general problem, i.e. super-resolving a low spatial resolution image with a high spatial resolution image acquired under different conditions (e.g. at different times and/or in different acquisition bands), this letter proposes a notion of unified fusion that can accomplish both SSF and STF in one process. A Bayesian framework is subsequently developed to implement SSF, STF and unified fusion to generate ‘virtual sensor’ data, characterized by high spatial, temporal and spectral resolution simultaneously. The proposed method was then applied to the fusion of Moderate Resolution Imaging Spectroradiometer (MODIS) and Landsat Enhanced Thematic Mapper Plus (ETM+) images of the Hong Kong area, with the average spatial correlation coefficient exceeding 0.9 for near infrared–red–green bands between the fused result and the input Landsat image and with good preservation of the MODIS spectral properties. © 2013 Taylor & Francis Group, LLC. All rights reserved.
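The quality metric reported in the abstract is the per-band spatial correlation coefficient between the fused result and the reference image. As a minimal sketch of how such a metric is typically computed (the function name and the synthetic data below are illustrative assumptions, not taken from the paper), it amounts to a Pearson correlation over the flattened pixel values of each band:

```python
import numpy as np

def band_correlation(fused: np.ndarray, reference: np.ndarray) -> float:
    """Pearson correlation between two single-band images of equal shape."""
    f = fused.ravel().astype(np.float64)
    r = reference.ravel().astype(np.float64)
    return float(np.corrcoef(f, r)[0, 1])

# Toy example with synthetic 2-D "bands" (not real MODIS/ETM+ data):
rng = np.random.default_rng(0)
reference = rng.random((64, 64))
fused = reference + 0.05 * rng.random((64, 64))  # near-copy, so correlation is high
print(band_correlation(fused, reference))  # close to 1, well above the 0.9 threshold
```

For multispectral imagery one would apply this per band (e.g. to the near-infrared, red and green bands separately) and average the results.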
Persistent Identifier: http://hdl.handle.net/10722/329429
ISSN: 2150-704X
2023 Impact Factor: 1.4
2023 SCImago Journal Rankings: 0.458
ISI Accession Number ID: WOS:000316087000005


DC Field: Value
dc.contributor.author: Huang, Bo
dc.contributor.author: Zhang, Hankui
dc.contributor.author: Song, Huihui
dc.contributor.author: Wang, Juan
dc.contributor.author: Song, Chunqiao
dc.date.accessioned: 2023-08-09T03:32:43Z
dc.date.available: 2023-08-09T03:32:43Z
dc.date.issued: 2013
dc.identifier.citation: Remote Sensing Letters, 2013, v. 4, n. 6, p. 561-569
dc.identifier.issn: 2150-704X
dc.identifier.uri: http://hdl.handle.net/10722/329429
dc.description.abstract: Current satellite remote-sensing systems compromise between spatial resolution and spectral and/or temporal resolution, which potentially limits the use of remotely sensed data in various applications. Image fusion processes, including spatial and spectral fusion (SSF) and spatial and temporal fusion (STF), provide powerful tools for addressing these technological limitations. Although SSF and STF have been extensively studied separately, they have not yet been integrated into a unified framework to generate synthetic satellite images with high spatial, temporal and spectral resolution. By formulating these two types of fusion into one general problem, i.e. super-resolving a low spatial resolution image with a high spatial resolution image acquired under different conditions (e.g. at different times and/or in different acquisition bands), this letter proposes a notion of unified fusion that can accomplish both SSF and STF in one process. A Bayesian framework is subsequently developed to implement SSF, STF and unified fusion to generate ‘virtual sensor’ data, characterized by high spatial, temporal and spectral resolution simultaneously. The proposed method was then applied to the fusion of Moderate Resolution Imaging Spectroradiometer (MODIS) and Landsat Enhanced Thematic Mapper Plus (ETM+) images of the Hong Kong area, with the average spatial correlation coefficient exceeding 0.9 for near infrared–red–green bands between the fused result and the input Landsat image and with good preservation of the MODIS spectral properties. © 2013 Taylor & Francis Group, LLC. All rights reserved.
dc.language: eng
dc.relation.ispartof: Remote Sensing Letters
dc.title: Unified fusion of remote-sensing imagery: Generating simultaneously high-resolution synthetic spatial–temporal–spectral earth observations
dc.type: Article
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1080/2150704X.2013.769283
dc.identifier.scopus: eid_2-s2.0-85008814963
dc.identifier.volume: 4
dc.identifier.issue: 6
dc.identifier.spage: 561
dc.identifier.epage: 569
dc.identifier.eissn: 2150-7058
dc.identifier.isi: WOS:000316087000005
