File Download
There are no files associated with this item.
Links for fulltext (may require subscription):
- Publisher website (DOI): 10.1016/j.neucom.2024.127469
- Scopus: eid_2-s2.0-85187173299
Citations:
- Scopus: 0
Article: Semi-supervised domain adaptation on graphs with contrastive learning and minimax entropy
Field | Value |
---|---|
Title | Semi-supervised domain adaptation on graphs with contrastive learning and minimax entropy |
Authors | Xiao, Jiaren; Dai, Quanyu; Shen, Xiao; Xie, Xiaochen; Dai, Jing; Lam, James; Kwok, Ka Wai |
Keywords | Adversarial learning; Graph contrastive learning; Graph transfer learning; Node classification; Semi-supervised domain adaptation |
Issue Date | 1-May-2024 |
Publisher | Elsevier |
Citation | Neurocomputing, 2024, v. 580 |
Abstract | Label scarcity in a graph is frequently encountered in real-world applications due to the high cost of data labeling. To this end, semi-supervised domain adaptation (SSDA) on graphs aims to leverage the knowledge of a labeled source graph to aid in node classification on a target graph with limited labels. SSDA tasks need to overcome the domain gap between the source and target graphs. However, to date, this challenging research problem has yet to be formally considered by the existing approaches designed for cross-graph node classification. This paper proposes a novel method called SemiGCL to tackle the graph Semi-supervised domain adaptation with Graph Contrastive Learning and minimax entropy training. SemiGCL generates informative node representations by contrasting the representations learned from a graph's local and global views. Additionally, SemiGCL is adversarially optimized with the entropy loss of unlabeled target nodes to reduce domain divergence. Experimental results on benchmark datasets demonstrate that SemiGCL outperforms the state-of-the-art baselines on the SSDA tasks. |
Persistent Identifier | http://hdl.handle.net/10722/344367 |
ISSN | 0925-2312 (2023 Impact Factor: 5.5; 2023 SCImago Journal Rankings: 1.815) |
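The abstract mentions training with the entropy loss of unlabeled target nodes under a minimax objective, but this record page gives no details. As a rough, hypothetical sketch of the general minimax-entropy idea (not SemiGCL's actual implementation), a single entropy quantity over target predictions is ascended by the classifier and descended by the feature encoder:

```python
import math

def entropy(probs):
    """Shannon entropy H(p) = -sum_i p_i * log(p_i) of one predicted
    class distribution (the 0 * log 0 term is treated as 0)."""
    return -sum(p * math.log(p) for p in probs if p > 0.0)

def target_entropy_loss(target_probs):
    """Mean prediction entropy over unlabeled target nodes. In
    minimax-entropy training this one scalar is used with opposite
    signs: the classifier maximizes it while the feature encoder
    minimizes it, commonly implemented with a gradient-reversal
    layer between the two."""
    return sum(entropy(p) for p in target_probs) / len(target_probs)

# Confident predictions have low entropy; a uniform prediction has
# the maximum possible entropy, log(num_classes).
confident = [[0.98, 0.01, 0.01]]
uniform = [[1 / 3, 1 / 3, 1 / 3]]
assert target_entropy_loss(confident) < target_entropy_loss(uniform)
```

The function names and the two-player sign convention above are illustrative assumptions; the paper itself should be consulted for SemiGCL's exact loss.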
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Xiao, Jiaren | - |
dc.contributor.author | Dai, Quanyu | - |
dc.contributor.author | Shen, Xiao | - |
dc.contributor.author | Xie, Xiaochen | - |
dc.contributor.author | Dai, Jing | - |
dc.contributor.author | Lam, James | - |
dc.contributor.author | Kwok, Ka Wai | - |
dc.date.accessioned | 2024-07-24T13:51:02Z | - |
dc.date.available | 2024-07-24T13:51:02Z | - |
dc.date.issued | 2024-05-01 | - |
dc.identifier.citation | Neurocomputing, 2024, v. 580 | - |
dc.identifier.issn | 0925-2312 | - |
dc.identifier.uri | http://hdl.handle.net/10722/344367 | - |
dc.description.abstract | Label scarcity in a graph is frequently encountered in real-world applications due to the high cost of data labeling. To this end, semi-supervised domain adaptation (SSDA) on graphs aims to leverage the knowledge of a labeled source graph to aid in node classification on a target graph with limited labels. SSDA tasks need to overcome the domain gap between the source and target graphs. However, to date, this challenging research problem has yet to be formally considered by the existing approaches designed for cross-graph node classification. This paper proposes a novel method called SemiGCL to tackle the graph Semi-supervised domain adaptation with Graph Contrastive Learning and minimax entropy training. SemiGCL generates informative node representations by contrasting the representations learned from a graph's local and global views. Additionally, SemiGCL is adversarially optimized with the entropy loss of unlabeled target nodes to reduce domain divergence. Experimental results on benchmark datasets demonstrate that SemiGCL outperforms the state-of-the-art baselines on the SSDA tasks. | - |
dc.language | eng | - |
dc.publisher | Elsevier | - |
dc.relation.ispartof | Neurocomputing | - |
dc.subject | Adversarial learning | - |
dc.subject | Graph contrastive learning | - |
dc.subject | Graph transfer learning | - |
dc.subject | Node classification | - |
dc.subject | Semi-supervised domain adaptation | - |
dc.title | Semi-supervised domain adaptation on graphs with contrastive learning and minimax entropy | - |
dc.type | Article | - |
dc.identifier.doi | 10.1016/j.neucom.2024.127469 | - |
dc.identifier.scopus | eid_2-s2.0-85187173299 | - |
dc.identifier.volume | 580 | - |
dc.identifier.eissn | 1872-8286 | - |
dc.identifier.issnl | 0925-2312 | - |