Conference Paper: Knowledge Graph Contrastive Learning for Recommendation

Title: Knowledge Graph Contrastive Learning for Recommendation
Authors: Yang, Yuhao; Huang, Chao; Xia, Lianghao; Li, Chenliang
Keywords: knowledge graph; recommendation; self-supervised learning
Issue Date: 2022
Citation: SIGIR 2022 - Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval, 2022, p. 1434-1443
Abstract: Knowledge Graphs (KGs) have been utilized as useful side information to improve recommendation quality. In these recommender systems, knowledge graph information often contains fruitful facts and inherent semantic relatedness among items. However, the success of such methods relies on high-quality knowledge graphs, and they may fail to learn quality representations due to two challenges: i) the long-tail distribution of entities results in sparse supervision signals for KG-enhanced item representation; ii) real-world knowledge graphs are often noisy and contain topic-irrelevant connections between items and entities. Such KG sparsity and noise cause item-entity dependencies to deviate from items' true characteristics, which significantly amplifies the noise effect and hinders the accurate representation of users' preferences. To fill this research gap, we design a general Knowledge Graph Contrastive Learning framework (KGCL) that alleviates information noise for knowledge graph-enhanced recommender systems. Specifically, we propose a knowledge graph augmentation schema to suppress KG noise in information aggregation and derive more robust knowledge-aware representations for items. In addition, we exploit additional supervision signals from the KG augmentation process to guide a cross-view contrastive learning paradigm, giving a greater role to unbiased user-item interactions in gradient descent and further suppressing noise. Extensive experiments on three public datasets demonstrate the consistent superiority of our KGCL over state-of-the-art techniques. KGCL also achieves strong performance in recommendation scenarios with sparse user-item interactions and long-tail, noisy KG entities. Our implementation code is available at https://github.com/yuh-yang/KGCL-SIGIR22.
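The pipeline the abstract outlines — stochastically augment the KG to produce two views of each item, then contrast the resulting embeddings across views — can be illustrated with a minimal, framework-agnostic sketch. This is a hypothetical toy illustration, not the authors' implementation (for that, see the linked repository); the function names `drop_triples` and `info_nce`, the dropout rate, and the temperature value are all assumptions.

```python
import math
import random

def drop_triples(triples, keep_prob=0.8, rng=None):
    """KG augmentation by random triple dropout: each (head, relation,
    tail) edge is kept with probability keep_prob, yielding one
    stochastic 'view' of the graph. Toy stand-in for the paper's
    augmentation schema."""
    rng = rng or random.Random(0)
    return [t for t in triples if rng.random() < keep_prob]

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u)) or 1.0
    nv = math.sqrt(sum(b * b for b in v)) or 1.0
    return dot / (nu * nv)

def info_nce(view_a, view_b, tau=0.2):
    """Cross-view contrastive (InfoNCE) loss: the i-th item embedding
    from one augmented view is pulled toward its counterpart in the
    other view and pushed away from every other item in the batch."""
    loss, n = 0.0, len(view_a)
    for i in range(n):
        sims = [math.exp(cosine(view_a[i], view_b[j]) / tau)
                for j in range(n)]
        loss += -math.log(sims[i] / sum(sims))
    return loss / n
```

With item embeddings computed on two augmented views, aligned pairs yield a lower loss than mismatched ones; minimizing this loss is the self-supervised signal that complements the sparse user-item supervision.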
Persistent Identifier: http://hdl.handle.net/10722/355923
ISI Accession Number: WOS:000852715901048

 

DC Field: Value
dc.contributor.author: Yang, Yuhao
dc.contributor.author: Huang, Chao
dc.contributor.author: Xia, Lianghao
dc.contributor.author: Li, Chenliang
dc.date.accessioned: 2025-05-19T05:46:41Z
dc.date.available: 2025-05-19T05:46:41Z
dc.date.issued: 2022
dc.identifier.citation: SIGIR 2022 - Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval, 2022, p. 1434-1443
dc.identifier.uri: http://hdl.handle.net/10722/355923
dc.description.abstract: Knowledge Graphs (KGs) have been utilized as useful side information to improve recommendation quality. In these recommender systems, knowledge graph information often contains fruitful facts and inherent semantic relatedness among items. However, the success of such methods relies on high-quality knowledge graphs, and they may fail to learn quality representations due to two challenges: i) the long-tail distribution of entities results in sparse supervision signals for KG-enhanced item representation; ii) real-world knowledge graphs are often noisy and contain topic-irrelevant connections between items and entities. Such KG sparsity and noise cause item-entity dependencies to deviate from items' true characteristics, which significantly amplifies the noise effect and hinders the accurate representation of users' preferences. To fill this research gap, we design a general Knowledge Graph Contrastive Learning framework (KGCL) that alleviates information noise for knowledge graph-enhanced recommender systems. Specifically, we propose a knowledge graph augmentation schema to suppress KG noise in information aggregation and derive more robust knowledge-aware representations for items. In addition, we exploit additional supervision signals from the KG augmentation process to guide a cross-view contrastive learning paradigm, giving a greater role to unbiased user-item interactions in gradient descent and further suppressing noise. Extensive experiments on three public datasets demonstrate the consistent superiority of our KGCL over state-of-the-art techniques. KGCL also achieves strong performance in recommendation scenarios with sparse user-item interactions and long-tail, noisy KG entities. Our implementation code is available at https://github.com/yuh-yang/KGCL-SIGIR22.
dc.language: eng
dc.relation.ispartof: SIGIR 2022 - Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval
dc.subject: knowledge graph
dc.subject: recommendation
dc.subject: self-supervised learning
dc.title: Knowledge Graph Contrastive Learning for Recommendation
dc.type: Conference_Paper
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1145/3477495.3532009
dc.identifier.scopus: eid_2-s2.0-85130072361
dc.identifier.spage: 1434
dc.identifier.epage: 1443
dc.identifier.isi: WOS:000852715901048
