
Conference Paper: Constructing a unified framework for multi-source remotely sensed data fusion

Title: Constructing a unified framework for multi-source remotely sensed data fusion
Authors: Chen, Bin; Huang, Bo; Xu, Bing
Keywords: remote sensing; unified fusion; China HJ-1A; MODIS; spatial-temporal-spectral-angular
Issue Date: 2016
Citation: International Geoscience and Remote Sensing Symposium (IGARSS), 2016, v. 2016-November, p. 2574-2577
Abstract: Remotely sensed data fusion, which blends multi-sensor observations to generate synthetic fused data, is regarded as a cost-effective approach to tackling the fixed trade-off among satellite sensors' spatial, temporal, spectral, and angular characteristics. However, previous studies have mainly focused on one-to-one fusion modes, and studies of unified fusion remain limited. This paper aims to construct a unified framework for multi-source remotely sensed data fusion. Experimental tests using remotely sensed data, including China HJ-1A CCD/HSI, MCD43A1, and MCD43A4, showed that the proposed framework can generate synthetic fused data with simultaneously fine spatial, temporal, spectral, and angular resolutions. Specifically, the synthetic fusion accurately captures temporal changes while integrating spatial details, and combines multi-angular observation information while preserving spectral fidelity. The unified fusion framework can also be flexibly extended to arbitrary optical satellites and holds potential utility for making full use of available remotely sensed observations. (See the illustrative sketch below.)
Persistent Identifier: http://hdl.handle.net/10722/299541
ISI Accession Number ID: WOS:000388114602165
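
The abstract does not detail the fusion algorithm itself. Purely as an illustration of the kind of operation such spatiotemporal blending involves, the Python/NumPy sketch below shows the temporal-change-transfer idea used by many fusion methods: a fine-resolution base image is updated with the upsampled coarse-resolution change between two dates. The function names, array shapes, and nearest-neighbour upsampling are assumptions made for this sketch and are not taken from the paper's unified framework.

    import numpy as np

    def upsample_nearest(coarse, factor):
        """Block-replicate a 2-D coarse image by an integer factor (nearest-neighbour upsampling)."""
        return np.kron(coarse, np.ones((factor, factor), dtype=coarse.dtype))

    def fuse_temporal_change(fine_base, coarse_base, coarse_pred, factor):
        """Predict a fine-resolution image at the prediction date as
        F_pred = F_base + upsample(C_pred - C_base),
        i.e. transfer the coarse-sensor temporal change onto the fine-sensor base image.
        This is only the simplest spatiotemporal blending idea, not the paper's framework."""
        return fine_base + upsample_nearest(coarse_pred - coarse_base, factor)

    # Toy example with synthetic arrays (hypothetical 10x10 coarse grid, 16x finer fine grid).
    rng = np.random.default_rng(0)
    factor = 16
    fine_base = rng.random((160, 160))      # fine-resolution image at the base date
    coarse_base = rng.random((10, 10))      # coarse-resolution image at the base date
    coarse_pred = coarse_base + 0.05        # coarse-resolution image at the prediction date
    fine_pred = fuse_temporal_change(fine_base, coarse_base, coarse_pred, factor)
    print(fine_pred.shape)                  # -> (160, 160)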

 

DC Field | Value | Language
dc.contributor.author | Chen, Bin | -
dc.contributor.author | Huang, Bo | -
dc.contributor.author | Xu, Bing | -
dc.date.accessioned | 2021-05-21T03:34:38Z | -
dc.date.available | 2021-05-21T03:34:38Z | -
dc.date.issued | 2016 | -
dc.identifier.citation | International Geoscience and Remote Sensing Symposium (IGARSS), 2016, v. 2016-November, p. 2574-2577 | -
dc.identifier.uri | http://hdl.handle.net/10722/299541 | -
dc.description.abstract | Remotely sensed data fusion, which blends multi-sensor observations to generate synthetic fused data, is regarded as a cost-effective approach to tackling the fixed trade-off among satellite sensors' spatial, temporal, spectral, and angular characteristics. However, previous studies have mainly focused on one-to-one fusion modes, and studies of unified fusion remain limited. This paper aims to construct a unified framework for multi-source remotely sensed data fusion. Experimental tests using remotely sensed data, including China HJ-1A CCD/HSI, MCD43A1, and MCD43A4, showed that the proposed framework can generate synthetic fused data with simultaneously fine spatial, temporal, spectral, and angular resolutions. Specifically, the synthetic fusion accurately captures temporal changes while integrating spatial details, and combines multi-angular observation information while preserving spectral fidelity. The unified fusion framework can also be flexibly extended to arbitrary optical satellites and holds potential utility for making full use of available remotely sensed observations. | -
dc.language | eng | -
dc.relation.ispartof | International Geoscience and Remote Sensing Symposium (IGARSS) | -
dc.subject | remote sensing | -
dc.subject | Unified fusion | -
dc.subject | China HJ-1A | -
dc.subject | MODIS | -
dc.subject | spatial-temporal-spectral-angular | -
dc.title | Constructing a unified framework for multi-source remotely sensed data fusion | -
dc.type | Conference_Paper | -
dc.description.nature | link_to_subscribed_fulltext | -
dc.identifier.doi | 10.1109/IGARSS.2016.7729665 | -
dc.identifier.scopus | eid_2-s2.0-85007447718 | -
dc.identifier.volume | 2016-November | -
dc.identifier.spage | 2574 | -
dc.identifier.epage | 2577 | -
dc.identifier.isi | WOS:000388114602165 | -
