Links for fulltext
(May Require Subscription)
- Publisher Website: 10.1145/3477495.3532009
- Scopus: eid_2-s2.0-85130072361
- WOS: WOS:000852715901048
Conference Paper: Knowledge Graph Contrastive Learning for Recommendation
| Title | Knowledge Graph Contrastive Learning for Recommendation |
|---|---|
| Authors | Yang, Yuhao; Huang, Chao; Xia, Lianghao; Li, Chenliang |
| Keywords | knowledge graph; recommendation; self-supervised learning |
| Issue Date | 2022 |
| Citation | SIGIR 2022 - Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval, 2022, p. 1434-1443 |
| Abstract | Knowledge Graphs (KGs) have been utilized as useful side information to improve recommendation quality. In those recommender systems, knowledge graph information often contains fruitful facts and inherent semantic relatedness among items. However, the success of such methods relies on high-quality knowledge graphs, and they may fail to learn quality representations owing to two challenges: i) the long-tail distribution of entities results in sparse supervision signals for KG-enhanced item representation; ii) real-world knowledge graphs are often noisy and contain topic-irrelevant connections between items and entities. Such KG sparsity and noise make the item-entity dependent relations deviate from reflecting their true characteristics, which significantly amplifies the noise effect and hinders the accurate representation of users' preferences. To fill this research gap, we design a general Knowledge Graph Contrastive Learning framework (KGCL) that alleviates the information noise for knowledge graph-enhanced recommender systems. Specifically, we propose a knowledge graph augmentation schema to suppress KG noise in information aggregation and derive more robust knowledge-aware representations for items. In addition, we exploit additional supervision signals from the KG augmentation process to guide a cross-view contrastive learning paradigm, giving a greater role to unbiased user-item interactions in gradient descent and further suppressing the noise. Extensive experiments on three public datasets demonstrate the consistent superiority of our KGCL over state-of-the-art techniques. KGCL also achieves strong performance in recommendation scenarios with sparse user-item interactions and long-tail, noisy KG entities. Our implementation code is available at https://github.com/yuh-yang/KGCL-SIGIR22. |
| Persistent Identifier | http://hdl.handle.net/10722/355923 |
| ISI Accession Number ID | WOS:000852715901048 |
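The cross-view contrastive paradigm described in the abstract can be sketched as an InfoNCE-style objective: embeddings of the same item under two augmented KG views are treated as positives, and all other items in the batch as negatives. This is a minimal illustrative sketch, not the authors' implementation (see the linked GitHub repository); the `info_nce` function, the temperature value, and the toy embeddings are assumptions.

```python
import numpy as np

def info_nce(view1, view2, temperature=0.2):
    """InfoNCE loss between two views: row i of view1 and row i of view2
    are a positive pair; every other row in the batch is a negative."""
    # L2-normalize rows so dot products are cosine similarities
    z1 = view1 / np.linalg.norm(view1, axis=1, keepdims=True)
    z2 = view2 / np.linalg.norm(view2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature            # pairwise similarity matrix
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # positives sit on the diagonal of the similarity matrix
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
emb = rng.normal(size=(8, 16))
# two lightly perturbed "views" of the same 8 items (stand-in for KG augmentation)
loss = info_nce(emb + 0.01 * rng.normal(size=(8, 16)),
                emb + 0.01 * rng.normal(size=(8, 16)))
```

Pulling matched views together while pushing mismatched items apart is what lets the unbiased user-item signal dominate the gradient, as the abstract describes.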
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Yang, Yuhao | - |
| dc.contributor.author | Huang, Chao | - |
| dc.contributor.author | Xia, Lianghao | - |
| dc.contributor.author | Li, Chenliang | - |
| dc.date.accessioned | 2025-05-19T05:46:41Z | - |
| dc.date.available | 2025-05-19T05:46:41Z | - |
| dc.date.issued | 2022 | - |
| dc.identifier.citation | SIGIR 2022 - Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval, 2022, p. 1434-1443 | - |
| dc.identifier.uri | http://hdl.handle.net/10722/355923 | - |
| dc.description.abstract | Knowledge Graphs (KGs) have been utilized as useful side information to improve recommendation quality. In those recommender systems, knowledge graph information often contains fruitful facts and inherent semantic relatedness among items. However, the success of such methods relies on high-quality knowledge graphs, and they may fail to learn quality representations owing to two challenges: i) the long-tail distribution of entities results in sparse supervision signals for KG-enhanced item representation; ii) real-world knowledge graphs are often noisy and contain topic-irrelevant connections between items and entities. Such KG sparsity and noise make the item-entity dependent relations deviate from reflecting their true characteristics, which significantly amplifies the noise effect and hinders the accurate representation of users' preferences. To fill this research gap, we design a general Knowledge Graph Contrastive Learning framework (KGCL) that alleviates the information noise for knowledge graph-enhanced recommender systems. Specifically, we propose a knowledge graph augmentation schema to suppress KG noise in information aggregation and derive more robust knowledge-aware representations for items. In addition, we exploit additional supervision signals from the KG augmentation process to guide a cross-view contrastive learning paradigm, giving a greater role to unbiased user-item interactions in gradient descent and further suppressing the noise. Extensive experiments on three public datasets demonstrate the consistent superiority of our KGCL over state-of-the-art techniques. KGCL also achieves strong performance in recommendation scenarios with sparse user-item interactions and long-tail, noisy KG entities. Our implementation code is available at https://github.com/yuh-yang/KGCL-SIGIR22. | - |
| dc.language | eng | - |
| dc.relation.ispartof | SIGIR 2022 - Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval | - |
| dc.subject | knowledge graph | - |
| dc.subject | recommendation | - |
| dc.subject | self-supervised learning | - |
| dc.title | Knowledge Graph Contrastive Learning for Recommendation | - |
| dc.type | Conference_Paper | - |
| dc.description.nature | link_to_subscribed_fulltext | - |
| dc.identifier.doi | 10.1145/3477495.3532009 | - |
| dc.identifier.scopus | eid_2-s2.0-85130072361 | - |
| dc.identifier.spage | 1434 | - |
| dc.identifier.epage | 1443 | - |
| dc.identifier.isi | WOS:000852715901048 | - |
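The knowledge graph augmentation schema mentioned in the abstract can be illustrated, in highly simplified form, as creating two views of the KG by randomly dropping (head, relation, tail) triples. This is a hedged sketch only: the paper's actual schema is more sophisticated than uniform edge dropout, and the `dropout_kg` helper, the keep probability, and the toy triples below are all assumptions for illustration.

```python
import random

def dropout_kg(triples, keep_prob=0.8, seed=0):
    """Return an augmented KG view by keeping each (head, relation, tail)
    triple independently with probability keep_prob."""
    rng = random.Random(seed)
    return [t for t in triples if rng.random() < keep_prob]

# Hypothetical movie-domain triples linking items to KG entities
kg = [("item1", "directed_by", "e1"), ("item1", "genre", "e2"),
      ("item2", "genre", "e2"), ("item2", "starring", "e3"),
      ("item3", "genre", "e4")]
view_a = dropout_kg(kg, keep_prob=0.8, seed=1)
view_b = dropout_kg(kg, keep_prob=0.8, seed=2)
```

Item representations aggregated over `view_a` and `view_b` would then feed the cross-view contrastive objective, so that items whose representations are stable under KG perturbation contribute cleaner training signal.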
