Conference Paper: Multi-Behavior Hypergraph-Enhanced Transformer for Sequential Recommendation

Title: Multi-Behavior Hypergraph-Enhanced Transformer for Sequential Recommendation
Authors: Yang, Yuhao; Huang, Chao; Xia, Lianghao; Liang, Yuxuan; Yu, Yanwei; Li, Chenliang
Keywords: graph neural networks; hypergraph learning; multi-behavior recommendation; sequential recommendation
Issue Date: 2022
Citation: Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2022, p. 2263-2274
Abstract: Learning dynamic user preferences has become an increasingly important component for many online platforms (e.g., video-sharing sites, e-commerce systems) to make sequential recommendations. Previous works have made many efforts to model item-item transitions over user interaction sequences, based on various architectures, e.g., recurrent neural networks and self-attention mechanisms. Recently emerged graph neural networks also serve as useful backbone models to capture item dependencies in sequential recommendation scenarios. Despite their effectiveness, existing methods have thus far focused on item sequence representation with a single type of interaction, and are therefore limited in capturing the dynamic heterogeneous relational structures between users and items (e.g., page view, add-to-favorite, purchase). To tackle this challenge, we design a Multi-Behavior Hypergraph-enhanced Transformer framework (MBHT) to capture both short-term and long-term cross-type behavior dependencies. Specifically, a multi-scale Transformer is equipped with low-rank self-attention to jointly encode behavior-aware sequential patterns at fine-grained and coarse-grained levels. Additionally, we incorporate the global multi-behavior dependency into the hypergraph neural architecture to capture the hierarchical long-range item correlations in a customized manner. Experimental results demonstrate the superiority of our MBHT over various state-of-the-art recommendation solutions across different settings. Further ablation studies validate the effectiveness of our model design and the benefits of the new MBHT framework. Our implementation code is released at: https://github.com/yuh-yang/MBHT-KDD22.
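The low-rank self-attention mentioned in the abstract can be illustrated with a short sketch. This is not the authors' implementation (see their released repository for that); it is a generic Linformer-style approximation in which keys and values are projected down to k ≪ n sequence positions, so attention costs O(n·k) rather than O(n²). All names and dimensions below are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def low_rank_attention(Q, K, V, E, F):
    """Low-rank self-attention sketch (Linformer-style).

    Q, K, V: (n, d) query/key/value matrices for a sequence of n items.
    E, F:    (k, n) learned projections that compress the sequence axis.
    """
    d = Q.shape[1]
    K_low = E @ K                                # (k, d): compressed keys
    V_low = F @ V                                # (k, d): compressed values
    scores = softmax(Q @ K_low.T / np.sqrt(d))   # (n, k) attention map, not (n, n)
    return scores @ V_low                        # (n, d) attended output

# Toy usage: an 8-item sequence, 4-dim embeddings, rank k = 2
rng = np.random.default_rng(0)
n, d, k = 8, 4, 2
out = low_rank_attention(rng.normal(size=(n, d)), rng.normal(size=(n, d)),
                         rng.normal(size=(n, d)), rng.normal(size=(k, n)),
                         rng.normal(size=(k, n)))
print(out.shape)  # (8, 4)
```

Because the softmax is taken over only k compressed key slots, both memory and time in the attention map scale linearly with sequence length, which is what makes this family of approximations attractive for long multi-behavior sequences.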
Persistent Identifier: http://hdl.handle.net/10722/355928
ISI Accession Number: WOS:001119000302032
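The hypergraph neural component named in the abstract can likewise be sketched generically. The snippet below shows standard two-stage hypergraph message passing (node → hyperedge → node mean aggregation, in the spirit of HGNN-style layers); it is a hedged illustration of the general technique, not the MBHT architecture itself, and the incidence matrix `H` and feature matrix `X` are made up for the example.

```python
import numpy as np

def hypergraph_propagate(X, H):
    """One round of mean-aggregation hypergraph message passing.

    X: (n_nodes, f) node feature matrix.
    H: (n_nodes, n_edges) binary incidence matrix,
       H[i, e] = 1 iff node i belongs to hyperedge e.
    """
    De = H.sum(axis=0)                    # hyperedge degrees
    Dv = H.sum(axis=1)                    # node degrees
    edge_feat = (H.T @ X) / De[:, None]   # average member nodes into each hyperedge
    return (H @ edge_feat) / Dv[:, None]  # average incident hyperedges back onto nodes

# Toy example: 3 items, 2 hyperedges {item0, item1} and {item1, item2}
H = np.array([[1., 0.],
              [1., 1.],
              [0., 1.]])
X = np.array([[1.], [2.], [3.]])
print(hypergraph_propagate(X, H))  # [[1.5], [2.0], [2.5]]
```

A hyperedge connects an arbitrary set of nodes rather than a pair, which is what lets this kind of layer encode long-range, group-wise item correlations that pairwise graph convolutions cannot express in one hop.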

 

DC Field | Value
dc.contributor.author | Yang, Yuhao
dc.contributor.author | Huang, Chao
dc.contributor.author | Xia, Lianghao
dc.contributor.author | Liang, Yuxuan
dc.contributor.author | Yu, Yanwei
dc.contributor.author | Li, Chenliang
dc.date.accessioned | 2025-05-19T05:46:43Z
dc.date.available | 2025-05-19T05:46:43Z
dc.date.issued | 2022
dc.identifier.citation | Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2022, p. 2263-2274
dc.identifier.uri | http://hdl.handle.net/10722/355928
dc.description.abstract | Learning dynamic user preferences has become an increasingly important component for many online platforms (e.g., video-sharing sites, e-commerce systems) to make sequential recommendations. Previous works have made many efforts to model item-item transitions over user interaction sequences, based on various architectures, e.g., recurrent neural networks and self-attention mechanisms. Recently emerged graph neural networks also serve as useful backbone models to capture item dependencies in sequential recommendation scenarios. Despite their effectiveness, existing methods have thus far focused on item sequence representation with a single type of interaction, and are therefore limited in capturing the dynamic heterogeneous relational structures between users and items (e.g., page view, add-to-favorite, purchase). To tackle this challenge, we design a Multi-Behavior Hypergraph-enhanced Transformer framework (MBHT) to capture both short-term and long-term cross-type behavior dependencies. Specifically, a multi-scale Transformer is equipped with low-rank self-attention to jointly encode behavior-aware sequential patterns at fine-grained and coarse-grained levels. Additionally, we incorporate the global multi-behavior dependency into the hypergraph neural architecture to capture the hierarchical long-range item correlations in a customized manner. Experimental results demonstrate the superiority of our MBHT over various state-of-the-art recommendation solutions across different settings. Further ablation studies validate the effectiveness of our model design and the benefits of the new MBHT framework. Our implementation code is released at: https://github.com/yuh-yang/MBHT-KDD22.
dc.language | eng
dc.relation.ispartof | Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
dc.subject | graph neural networks
dc.subject | hypergraph learning
dc.subject | multi-behavior recommendation
dc.subject | sequential recommendation
dc.title | Multi-Behavior Hypergraph-Enhanced Transformer for Sequential Recommendation
dc.type | Conference_Paper
dc.description.nature | link_to_subscribed_fulltext
dc.identifier.doi | 10.1145/3534678.3539342
dc.identifier.scopus | eid_2-s2.0-85137147497
dc.identifier.spage | 2263
dc.identifier.epage | 2274
dc.identifier.isi | WOS:001119000302032
