Links for fulltext
(May Require Subscription)
- Publisher Website: 10.1145/3534678.3539342
- Scopus: eid_2-s2.0-85137147497
- WOS: WOS:001119000302032
Conference Paper: Multi-Behavior Hypergraph-Enhanced Transformer for Sequential Recommendation
| Title | Multi-Behavior Hypergraph-Enhanced Transformer for Sequential Recommendation |
|---|---|
| Authors | Yang, Yuhao; Huang, Chao; Xia, Lianghao; Liang, Yuxuan; Yu, Yanwei; Li, Chenliang |
| Keywords | graph neural networks; hypergraph learning; multi-behavior recommendation; sequential recommendation |
| Issue Date | 2022 |
| Citation | Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2022, p. 2263-2274 |
| Abstract | Learning dynamic user preferences has become an increasingly important component for many online platforms (e.g., video-sharing sites, e-commerce systems) to make sequential recommendations. Previous works have made many efforts to model item-item transitions over user interaction sequences, based on various architectures, e.g., recurrent neural networks and the self-attention mechanism. Recently emerged graph neural networks also serve as useful backbone models to capture item dependencies in sequential recommendation scenarios. Despite their effectiveness, existing methods have so far focused on item sequence representation with a single type of interaction, and thus are limited in capturing the dynamic heterogeneous relational structures between users and items (e.g., page view, add-to-favorite, purchase). To tackle this challenge, we design a Multi-Behavior Hypergraph-enhanced Transformer framework (MBHT) to capture both short-term and long-term cross-type behavior dependencies. Specifically, a multi-scale Transformer is equipped with low-rank self-attention to jointly encode behavior-aware sequential patterns at fine-grained and coarse-grained levels. Additionally, we incorporate the global multi-behavior dependency into the hypergraph neural architecture to capture the hierarchical long-range item correlations in a customized manner. Experimental results demonstrate the superiority of our MBHT over various state-of-the-art recommendation solutions across different settings. Further ablation studies validate the effectiveness of our model design and the benefits of the new MBHT framework. Our implementation code is released at: https://github.com/yuh-yang/MBHT-KDD22. |
| Persistent Identifier | http://hdl.handle.net/10722/355928 |
| ISI Accession Number ID | WOS:001119000302032 |
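The abstract's "low-rank self-attention" can be illustrated with a minimal NumPy sketch. This is a generic Linformer-style construction, not the paper's released implementation: the projection matrices `E`, `F` and all other names here are illustrative assumptions.

```python
import numpy as np

def low_rank_self_attention(X, Wq, Wk, Wv, E, F):
    """Linformer-style low-rank self-attention, sketched for illustration.
    X: (n, d) item-sequence embeddings. E, F: (k, n) learned projections
    that compress keys/values to rank k << n, cutting attention cost from
    O(n^2) to O(n*k). Names are illustrative, not the paper's code."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv            # queries/keys/values, (n, d)
    K_low, V_low = E @ K, F @ V                 # compressed to (k, d)
    scores = Q @ K_low.T / np.sqrt(X.shape[1])  # (n, k) scaled dot products
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)           # softmax over the k slots
    return w @ V_low                            # (n, d) attended output

# toy usage: n=6 items, d=4 dims, rank k=2
rng = np.random.default_rng(0)
n, d, k = 6, 4, 2
X = rng.normal(size=(n, d))
Wq = rng.normal(size=(d, d))
Wk = rng.normal(size=(d, d))
Wv = rng.normal(size=(d, d))
E = rng.normal(size=(k, n))
F = rng.normal(size=(k, n))
out = low_rank_self_attention(X, Wq, Wk, Wv, E, F)
print(out.shape)  # (6, 4)
```

Because attention weights are computed against only `k` compressed slots instead of all `n` positions, the cost of each layer scales linearly in sequence length, which is what makes long multi-behavior sequences tractable.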
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Yang, Yuhao | - |
| dc.contributor.author | Huang, Chao | - |
| dc.contributor.author | Xia, Lianghao | - |
| dc.contributor.author | Liang, Yuxuan | - |
| dc.contributor.author | Yu, Yanwei | - |
| dc.contributor.author | Li, Chenliang | - |
| dc.date.accessioned | 2025-05-19T05:46:43Z | - |
| dc.date.available | 2025-05-19T05:46:43Z | - |
| dc.date.issued | 2022 | - |
| dc.identifier.citation | Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2022, p. 2263-2274 | - |
| dc.identifier.uri | http://hdl.handle.net/10722/355928 | - |
| dc.description.abstract | Learning dynamic user preferences has become an increasingly important component for many online platforms (e.g., video-sharing sites, e-commerce systems) to make sequential recommendations. Previous works have made many efforts to model item-item transitions over user interaction sequences, based on various architectures, e.g., recurrent neural networks and the self-attention mechanism. Recently emerged graph neural networks also serve as useful backbone models to capture item dependencies in sequential recommendation scenarios. Despite their effectiveness, existing methods have so far focused on item sequence representation with a single type of interaction, and thus are limited in capturing the dynamic heterogeneous relational structures between users and items (e.g., page view, add-to-favorite, purchase). To tackle this challenge, we design a Multi-Behavior Hypergraph-enhanced Transformer framework (MBHT) to capture both short-term and long-term cross-type behavior dependencies. Specifically, a multi-scale Transformer is equipped with low-rank self-attention to jointly encode behavior-aware sequential patterns at fine-grained and coarse-grained levels. Additionally, we incorporate the global multi-behavior dependency into the hypergraph neural architecture to capture the hierarchical long-range item correlations in a customized manner. Experimental results demonstrate the superiority of our MBHT over various state-of-the-art recommendation solutions across different settings. Further ablation studies validate the effectiveness of our model design and the benefits of the new MBHT framework. Our implementation code is released at: https://github.com/yuh-yang/MBHT-KDD22. | - |
| dc.language | eng | - |
| dc.relation.ispartof | Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining | - |
| dc.subject | graph neural networks | - |
| dc.subject | hypergraph learning | - |
| dc.subject | multi-behavior recommendation | - |
| dc.subject | sequential recommendation | - |
| dc.title | Multi-Behavior Hypergraph-Enhanced Transformer for Sequential Recommendation | - |
| dc.type | Conference_Paper | - |
| dc.description.nature | link_to_subscribed_fulltext | - |
| dc.identifier.doi | 10.1145/3534678.3539342 | - |
| dc.identifier.scopus | eid_2-s2.0-85137147497 | - |
| dc.identifier.spage | 2263 | - |
| dc.identifier.epage | 2274 | - |
| dc.identifier.isi | WOS:001119000302032 | - |
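The "hypergraph neural architecture" the abstract mentions can likewise be sketched: one HGNN-style convolution step propagates item embeddings through hyperedges. The incidence matrix `H` and the grouping of items by behavior type are assumptions for illustration; the paper's actual model is in the linked GitHub repository.

```python
import numpy as np

def hypergraph_conv(X, H, Theta):
    """One illustrative hypergraph-convolution step (HGNN-style).
    H: (num_items, num_hyperedges) incidence matrix; a hyperedge can
    group items linked by one behavior type (view, favorite, buy).
    Generic sketch under stated assumptions, not MBHT's released code."""
    Dv = H.sum(axis=1)            # vertex (item) degrees
    De = H.sum(axis=0)            # hyperedge degrees
    Xe = (H / De).T @ X           # items -> hyperedges (per-edge mean)
    Xv = (H @ Xe) / Dv[:, None]   # hyperedges -> items (per-item mean)
    return Xv @ Theta             # learnable linear transform

# toy usage: 5 items, 2 hyperedges, 3-dim embeddings
H = np.array([[1, 0],
              [1, 0],
              [1, 1],
              [0, 1],
              [0, 1]], dtype=float)
X = np.arange(15, dtype=float).reshape(5, 3)
Theta = np.eye(3)
out = hypergraph_conv(X, H, Theta)
print(out.shape)  # (5, 3)
```

Because a hyperedge connects many items at once, a single propagation step already mixes information across all items sharing a behavior pattern, which is how long-range correlations are captured without stacking many pairwise-graph layers.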
