Links for fulltext (may require subscription):
- Publisher Website (DOI): 10.1109/JSTARS.2018.2797894
- Scopus: eid_2-s2.0-85042131669
- Web of Science: WOS:000427425000012
Article: Spatiotemporal Satellite Image Fusion Using Deep Convolutional Neural Networks
Title | Spatiotemporal Satellite Image Fusion Using Deep Convolutional Neural Networks |
---|---|
Authors | Song, Huihui; Liu, Qingshan; Wang, Guojie; Hang, Renlong; Huang, Bo |
Keywords | Convolutional neural network (CNN); nonlinear mapping (NLM); spatial resolution; temporal resolution |
Issue Date | 2018 |
Citation | IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 2018, v. 11, n. 3, p. 821-829 |
Abstract | We propose a novel spatiotemporal fusion method based on deep convolutional neural networks (CNNs) in the context of massive remote sensing data. In the training stage, we build two five-layer CNNs to deal with the problems of complicated correspondence and large spatial resolution gaps between MODIS and Landsat images. Specifically, we first learn a nonlinear mapping CNN between MODIS and low-spatial-resolution (LSR) Landsat images and then learn a super-resolution (SR) CNN between LSR Landsat and original Landsat images. In the prediction stage, instead of directly taking the outputs of the CNNs as the fusion result, we design a fusion model consisting of high-pass modulation and a weighting strategy to make full use of the information in prior images. Specifically, we first map the input MODIS images to transitional images via the learned nonlinear mapping CNN and further improve the transitional images to LSR Landsat images via the fusion model; then, via the learned SR CNN, the LSR Landsat images are super-resolved to transitional images, which are further improved to Landsat images via the fusion model. Compared with previous learning-based fusion methods, mainly the sparse-representation-based methods, our CNN-based spatiotemporal method has the following advantages: 1) it automatically extracts effective image features; 2) it learns an end-to-end mapping between MODIS and LSR Landsat images; and 3) it generates more favorable fusion results. To examine the performance of the proposed fusion method, we conduct experiments on two representative Landsat-MODIS datasets, comparing it with the sparse-representation-based spatiotemporal fusion model. Quantitative evaluations on all possible prediction dates, together with visual and quantitative comparisons of the fusion results on one key date, demonstrate that the proposed method generates more accurate fusion results. |
Persistent Identifier | http://hdl.handle.net/10722/329495 |
ISSN | 1939-1404 (print); 2151-1535 (online) |
Journal Metrics | 2023 Impact Factor: 4.7; 2023 SCImago Journal Rankings: 1.434 |
ISI Accession Number ID | WOS:000427425000012 |
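The prediction stage described in the abstract — high-pass modulation of the CNN's transitional output with detail carried over from prior fine-resolution images, followed by a similarity weighting across prior dates — can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: `nlm_cnn` is a stand-in for the trained five-layer nonlinear-mapping CNN, and the inverse-mean-absolute-difference weighting and all array names are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def highpass_modulate(transitional_pred, transitional_prior, fine_prior):
    """Add the high-frequency residual of a prior fine image to the
    CNN's transitional prediction (high-pass modulation): the CNN output
    supplies the low-frequency content, (fine_prior - transitional_prior)
    supplies spatial detail the mapping cannot recover."""
    return transitional_pred + (fine_prior - transitional_prior)

def weighted_fusion(candidates, coarse_pred, coarse_priors, eps=1e-6):
    """Blend candidate predictions from several prior dates, weighting
    each by how similar its coarse image is to the coarse image on the
    prediction date (inverse mean absolute difference)."""
    weights = np.array([1.0 / (np.abs(coarse_pred - c).mean() + eps)
                        for c in coarse_priors])
    weights /= weights.sum()
    return np.sum([w * cand for w, cand in zip(weights, candidates)], axis=0)

# Toy 8x8 single-band patches standing in for MODIS/Landsat imagery.
coarse_pred = rng.random((8, 8))                        # MODIS, prediction date
coarse_priors = [rng.random((8, 8)) for _ in range(2)]  # MODIS, two prior dates
fine_priors = [rng.random((8, 8)) for _ in range(2)]    # Landsat, same prior dates

# Placeholder for the learned nonlinear-mapping CNN.
nlm_cnn = lambda x: 0.9 * x + 0.05

transitional_pred = nlm_cnn(coarse_pred)
candidates = [highpass_modulate(transitional_pred, nlm_cnn(c), f)
              for c, f in zip(coarse_priors, fine_priors)]
fused = weighted_fusion(candidates, coarse_pred, coarse_priors)
print(fused.shape)  # (8, 8)
```

In the paper the same two steps are applied twice, once at the MODIS-to-LSR-Landsat scale and once after the SR CNN at the full Landsat scale; the sketch above shows only one pass.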
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Song, Huihui | - |
dc.contributor.author | Liu, Qingshan | - |
dc.contributor.author | Wang, Guojie | - |
dc.contributor.author | Hang, Renlong | - |
dc.contributor.author | Huang, Bo | - |
dc.date.accessioned | 2023-08-09T03:33:12Z | - |
dc.date.available | 2023-08-09T03:33:12Z | - |
dc.date.issued | 2018 | - |
dc.identifier.citation | IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 2018, v. 11, n. 3, p. 821-829 | - |
dc.identifier.issn | 1939-1404 | - |
dc.identifier.uri | http://hdl.handle.net/10722/329495 | - |
dc.description.abstract | We propose a novel spatiotemporal fusion method based on deep convolutional neural networks (CNNs) in the context of massive remote sensing data. In the training stage, we build two five-layer CNNs to deal with the problems of complicated correspondence and large spatial resolution gaps between MODIS and Landsat images. Specifically, we first learn a nonlinear mapping CNN between MODIS and low-spatial-resolution (LSR) Landsat images and then learn a super-resolution (SR) CNN between LSR Landsat and original Landsat images. In the prediction stage, instead of directly taking the outputs of the CNNs as the fusion result, we design a fusion model consisting of high-pass modulation and a weighting strategy to make full use of the information in prior images. Specifically, we first map the input MODIS images to transitional images via the learned nonlinear mapping CNN and further improve the transitional images to LSR Landsat images via the fusion model; then, via the learned SR CNN, the LSR Landsat images are super-resolved to transitional images, which are further improved to Landsat images via the fusion model. Compared with previous learning-based fusion methods, mainly the sparse-representation-based methods, our CNN-based spatiotemporal method has the following advantages: 1) it automatically extracts effective image features; 2) it learns an end-to-end mapping between MODIS and LSR Landsat images; and 3) it generates more favorable fusion results. To examine the performance of the proposed fusion method, we conduct experiments on two representative Landsat-MODIS datasets, comparing it with the sparse-representation-based spatiotemporal fusion model. Quantitative evaluations on all possible prediction dates, together with visual and quantitative comparisons of the fusion results on one key date, demonstrate that the proposed method generates more accurate fusion results. | - |
dc.language | eng | - |
dc.relation.ispartof | IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing | - |
dc.subject | Convolutional neural network (CNN) | - |
dc.subject | nonlinear mapping (NLM) | - |
dc.subject | spatial resolution | - |
dc.subject | temporal resolution | - |
dc.title | Spatiotemporal Satellite Image Fusion Using Deep Convolutional Neural Networks | - |
dc.type | Article | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1109/JSTARS.2018.2797894 | - |
dc.identifier.scopus | eid_2-s2.0-85042131669 | - |
dc.identifier.volume | 11 | - |
dc.identifier.issue | 3 | - |
dc.identifier.spage | 821 | - |
dc.identifier.epage | 829 | - |
dc.identifier.eissn | 2151-1535 | - |
dc.identifier.isi | WOS:000427425000012 | - |