Article: Unified fusion of remote-sensing imagery: Generating simultaneously high-resolution synthetic spatial–temporal–spectral earth observations
Title | Unified fusion of remote-sensing imagery: Generating simultaneously high-resolution synthetic spatial–temporal–spectral earth observations |
---|---|
Authors | Huang, Bo; Zhang, Hankui; Song, Huihui; Wang, Juan; Song, Chunqiao |
Issue Date | 2013 |
Citation | Remote Sensing Letters, 2013, v. 4, n. 6, p. 561-569 |
Abstract | Current satellite remote-sensing systems compromise between spatial resolution and spectral and/or temporal resolution, which potentially limits the use of remotely sensed data in various applications. Image fusion processes, including spatial and spectral fusion (SSF) and spatial and temporal fusion (STF), provide powerful tools for addressing these technological limitations. Although SSF and STF have been extensively studied separately, they have not yet been integrated into a unified framework to generate synthetic satellite images with high spatial, temporal and spectral resolution. By formulating these two types of fusion into one general problem, i.e. super-resolving a low spatial resolution image with a high spatial resolution image acquired under different conditions (e.g. at different times and/or in different acquisition bands), this letter proposes a notion of unified fusion that can accomplish both SSF and STF in one process. A Bayesian framework is subsequently developed to implement SSF, STF and unified fusion to generate 'virtual sensor' data, characterized by high spatial, temporal and spectral resolution simultaneously. The proposed method was then applied to the fusion of Moderate Resolution Imaging Spectroradiometer (MODIS) and Landsat Enhanced Thematic Mapper Plus (ETM+) images of the Hong Kong area, with the average spatial correlation coefficient exceeding 0.9 for the near infrared, red and green bands between the fused result and the input Landsat image and with good preservation of the MODIS spectral properties. © 2013 Taylor & Francis Group, LLC. All rights reserved. |
Persistent Identifier | http://hdl.handle.net/10722/329429 |
ISSN | 2150-704X (2023 Impact Factor: 1.4; 2023 SCImago Journal Rankings: 0.458) |
ISI Accession Number ID | WOS:000316087000005 |
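The abstract reports fusion quality as the spatial correlation coefficient between each band of the fused result and the corresponding band of the reference Landsat image. A minimal sketch of that per-band metric (the function name and toy data are illustrative, not from the letter):

```python
import numpy as np

def band_correlation(fused_band, reference_band):
    """Pearson spatial correlation coefficient between two co-registered image bands."""
    f = np.asarray(fused_band, dtype=float).ravel()
    r = np.asarray(reference_band, dtype=float).ravel()
    return np.corrcoef(f, r)[0, 1]

# Toy example: a reference band and a lightly perturbed "fused" band,
# standing in for a Landsat band and its fused MODIS/ETM+ counterpart.
rng = np.random.default_rng(0)
ref = rng.random((64, 64))
fused = ref + 0.05 * rng.standard_normal((64, 64))
cc = band_correlation(fused, ref)
```

In the letter this statistic, averaged over the near infrared, red and green bands, exceeds 0.9; in the toy example above the small perturbation likewise leaves the coefficient well above that threshold.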
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Huang, Bo | - |
dc.contributor.author | Zhang, Hankui | - |
dc.contributor.author | Song, Huihui | - |
dc.contributor.author | Wang, Juan | - |
dc.contributor.author | Song, Chunqiao | - |
dc.date.accessioned | 2023-08-09T03:32:43Z | - |
dc.date.available | 2023-08-09T03:32:43Z | - |
dc.date.issued | 2013 | - |
dc.identifier.citation | Remote Sensing Letters, 2013, v. 4, n. 6, p. 561-569 | - |
dc.identifier.issn | 2150-704X | - |
dc.identifier.uri | http://hdl.handle.net/10722/329429 | - |
dc.description.abstract | Current satellite remote-sensing systems compromise between spatial resolution and spectral and/or temporal resolution, which potentially limits the use of remotely sensed data in various applications. Image fusion processes, including spatial and spectral fusion (SSF) and spatial and temporal fusion (STF), provide powerful tools for addressing these technological limitations. Although SSF and STF have been extensively studied separately, they have not yet been integrated into a unified framework to generate synthetic satellite images with high spatial, temporal and spectral resolution. By formulating these two types of fusion into one general problem, i.e. super-resolving a low spatial resolution image with a high spatial resolution image acquired under different conditions (e.g. at different times and/or in different acquisition bands), this letter proposes a notion of unified fusion that can accomplish both SSF and STF in one process. A Bayesian framework is subsequently developed to implement SSF, STF and unified fusion to generate 'virtual sensor' data, characterized by high spatial, temporal and spectral resolution simultaneously. The proposed method was then applied to the fusion of Moderate Resolution Imaging Spectroradiometer (MODIS) and Landsat Enhanced Thematic Mapper Plus (ETM+) images of the Hong Kong area, with the average spatial correlation coefficient exceeding 0.9 for the near infrared, red and green bands between the fused result and the input Landsat image and with good preservation of the MODIS spectral properties. © 2013 Taylor & Francis Group, LLC. All rights reserved. | -
dc.language | eng | - |
dc.relation.ispartof | Remote Sensing Letters | - |
dc.title | Unified fusion of remote-sensing imagery: Generating simultaneously high-resolution synthetic spatial–temporal–spectral earth observations | - |
dc.type | Article | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1080/2150704X.2013.769283 | - |
dc.identifier.scopus | eid_2-s2.0-85008814963 | - |
dc.identifier.volume | 4 | - |
dc.identifier.issue | 6 | - |
dc.identifier.spage | 561 | - |
dc.identifier.epage | 569 | - |
dc.identifier.eissn | 2150-7058 | - |
dc.identifier.isi | WOS:000316087000005 | - |