Links for fulltext (may require subscription):
- Publisher website (DOI): 10.1109/TGRS.2024.3397740
- Scopus: eid_2-s2.0-85192736577
Citations:
- Scopus: 0
Article: Progressive Self-Supervised Pretraining for Hyperspectral Image Classification
Title | Progressive Self-Supervised Pretraining for Hyperspectral Image Classification |
---|---|
Authors | Guan, Peiyan; Lam, Edmund Y. |
Keywords | Hyperspectral image (HSI) classification; limited labels; self-supervised learning (SSL); transfer learning |
Issue Date | 6-May-2024 |
Publisher | IEEE |
Citation | IEEE Transactions on Geoscience and Remote Sensing, 2024, v. 62, p. 1-13 |
Abstract | Self-supervised learning has demonstrated considerable success in hyperspectral image (HSI) classification when limited labeled data are available. However, inherent dissimilarities among HSIs require self-supervised pretraining from scratch for each HSI dataset. Pretraining on a large amount of unlabeled data can be time-consuming. In addition, the poor quality of some HSIs can limit the performance of self-supervised learning (SSL) algorithms. To address these issues, we propose to enhance self-supervised pretraining on HSIs with transfer learning. We introduce a progressive self-supervised pretraining (PSP) framework that acquires strong initialization for the final pretraining on the target HSI dataset by sequentially performing self-supervised pretraining on datasets that are increasingly similar to the target HSI, specifically, first on a large general vision dataset and then on a related HSI dataset. This sequential strategy enables the model to progressively learn from domain-general vision knowledge to target-specific hyperspectral knowledge. To mitigate the catastrophic forgetting in sequential training, we develop a regularization method, called self-supervised elastic weight consolidation (SS-EWC), to impose adaptive constraints on the changes to model parameters. Thorough classification experiments on various HSI datasets demonstrate that our framework significantly and consistently improves the self-supervised pretraining on HSIs in terms of both convergence speed and representation quality. Furthermore, our framework exhibits high generalizability and can be applied to various SSL algorithms. Transfer learning continues to prove its usefulness in self-supervised settings. |
Persistent Identifier | http://hdl.handle.net/10722/350906 |
ISSN | 0196-2892 (2023 Impact Factor: 7.5; 2023 SCImago Journal Rankings: 2.403) |
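The abstract describes SS-EWC as an adaptive constraint on how much model parameters may change across sequential pretraining stages. As a point of reference only, the sketch below shows an EWC-style quadratic penalty of that general kind; the importance estimate (mean squared gradients of the self-supervised loss), the `ssl_loss_fn` interface, and all other names are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch (not the authors' code) of an EWC-style penalty of the kind the
# abstract attributes to SS-EWC: parameter changes are penalized in proportion to
# a per-parameter importance weight estimated on the previous pretraining stage.
import torch


def estimate_importance(model, ssl_loss_fn, loader):
    """Approximate per-parameter importance as mean squared gradients of the SSL loss."""
    importance = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
    model.eval()
    for batch in loader:
        model.zero_grad()
        loss = ssl_loss_fn(model, batch)  # hypothetical self-supervised objective
        loss.backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                importance[n] += p.grad.detach() ** 2
    return {n: v / max(len(loader), 1) for n, v in importance.items()}


def ewc_penalty(model, anchor_params, importance):
    """Quadratic penalty pulling parameters toward the previous stage's solution."""
    penalty = 0.0
    for n, p in model.named_parameters():
        penalty = penalty + (importance[n] * (p - anchor_params[n]) ** 2).sum()
    return penalty


def regularized_ssl_loss(model, batch, ssl_loss_fn, anchor_params, importance, lam=1.0):
    """SSL objective for the current stage plus the EWC-style constraint."""
    return ssl_loss_fn(model, batch) + lam * ewc_penalty(model, anchor_params, importance)
```

In a penalty of this form, the parameters that mattered most for the previous stage's objective are the most expensive to move, which is the general mechanism the abstract credits for mitigating catastrophic forgetting during sequential pretraining.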
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Guan, Peiyan | - |
dc.contributor.author | Lam, Edmund Y. | - |
dc.date.accessioned | 2024-11-06T00:30:34Z | - |
dc.date.available | 2024-11-06T00:30:34Z | - |
dc.date.issued | 2024-05-06 | - |
dc.identifier.citation | IEEE Transactions on Geoscience and Remote Sensing, 2024, v. 62, p. 1-13 | - |
dc.identifier.issn | 0196-2892 | - |
dc.identifier.uri | http://hdl.handle.net/10722/350906 | - |
dc.description.abstract | Self-supervised learning has demonstrated considerable success in hyperspectral image (HSI) classification when limited labeled data are available. However, inherent dissimilarities among HSIs require self-supervised pretraining from scratch for each HSI dataset. Pretraining on a large amount of unlabeled data can be time-consuming. In addition, the poor quality of some HSIs can limit the performance of self-supervised learning (SSL) algorithms. To address these issues, we propose to enhance self-supervised pretraining on HSIs with transfer learning. We introduce a progressive self-supervised pretraining (PSP) framework that acquires strong initialization for the final pretraining on the target HSI dataset by sequentially performing self-supervised pretraining on datasets that are increasingly similar to the target HSI, specifically, first on a large general vision dataset and then on a related HSI dataset. This sequential strategy enables the model to progressively learn from domain-general vision knowledge to target-specific hyperspectral knowledge. To mitigate the catastrophic forgetting in sequential training, we develop a regularization method, called self-supervised elastic weight consolidation (SS-EWC), to impose adaptive constraints on the changes to model parameters. Thorough classification experiments on various HSI datasets demonstrate that our framework significantly and consistently improves the self-supervised pretraining on HSIs in terms of both convergence speed and representation quality. Furthermore, our framework exhibits high generalizability and can be applied to various SSL algorithms. Transfer learning continues to prove its usefulness in self-supervised settings. | - |
dc.language | eng | - |
dc.publisher | IEEE | - |
dc.relation.ispartof | IEEE Transactions on Geoscience and Remote Sensing | - |
dc.rights | This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. | - |
dc.subject | Hyperspectral image (HSI) classification | - |
dc.subject | limited labels | - |
dc.subject | self-supervised learning (SSL) | - |
dc.subject | transfer learning | - |
dc.title | Progressive Self-Supervised Pretraining for Hyperspectral Image Classification | - |
dc.type | Article | - |
dc.identifier.doi | 10.1109/TGRS.2024.3397740 | - |
dc.identifier.scopus | eid_2-s2.0-85192736577 | - |
dc.identifier.volume | 62 | - |
dc.identifier.spage | 1 | - |
dc.identifier.epage | 13 | - |
dc.identifier.eissn | 1558-0644 | - |
dc.identifier.issnl | 0196-2892 | - |
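For context, the outline below sketches the progressive schedule the abstract describes, in which self-supervised pretraining is run on increasingly target-similar datasets (a large general vision dataset, then a related HSI dataset, then the target HSI), with each stage initialized from the previous one and regularized toward it. It reuses the `estimate_importance` helper from the earlier sketch; `ssl_pretrain` is a hypothetical routine that minimizes the regularized SSL objective. This is an illustration of the described idea under those assumptions, not the released code.

```python
# Illustrative outline (an assumption, not the authors' released implementation) of
# progressive self-supervised pretraining with an EWC-style constraint between stages.
def progressive_pretrain(model, stage_loaders, ssl_loss_fn, ssl_pretrain, lam=1.0):
    """stage_loaders: data loaders ordered from a large general vision dataset,
    through a related HSI dataset, to the target HSI dataset."""
    anchor_params, importance = None, None
    for loader in stage_loaders:
        # The first stage has no anchor; later stages are pulled toward the previous solution.
        model = ssl_pretrain(model, loader, anchor_params, importance, lam)
        # Snapshot this stage's solution and its importance weights for the next stage.
        anchor_params = {n: p.detach().clone() for n, p in model.named_parameters()}
        importance = estimate_importance(model, ssl_loss_fn, loader)
    return model
```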