Links for fulltext (may require subscription):
- Publisher Website: 10.1145/3583780.3614917
- Scopus: eid_2-s2.0-85178121542
- WOS: WOS:001161549500020
Conference Paper: How Expressive are Graph Neural Networks in Recommendation?
| Title | How Expressive are Graph Neural Networks in Recommendation? |
|---|---|
| Authors | Cai, Xuheng; Xia, Lianghao; Ren, Xubin; Huang, Chao |
| Keywords | Graph Neural Networks; Recommender Systems |
| Issue Date | 2023 |
| Citation | International Conference on Information and Knowledge Management, Proceedings, 2023, p. 173-182 |
| Abstract | Graph Neural Networks (GNNs) have demonstrated superior performance in various graph learning tasks, including recommendation, where they explore user-item collaborative filtering signals within graphs. However, despite their empirical effectiveness in state-of-the-art recommender models, theoretical formulations of their capability are scarce. Recently, researchers have explored the expressiveness of GNNs, demonstrating that message passing GNNs are at most as powerful as the Weisfeiler-Lehman test, and that GNNs combined with random node initialization are universal. Nevertheless, the concept of "expressiveness" for GNNs remains vaguely defined. Most existing works adopt the graph isomorphism test as the metric of expressiveness, but this graph-level task may not effectively assess a model's ability in recommendation, where the objective is to distinguish nodes of different closeness. In this paper, we provide a comprehensive theoretical analysis of the expressiveness of GNNs in recommendation, considering three levels of expressiveness metrics: graph isomorphism (graph-level), node automorphism (node-level), and topological closeness (link-level). We propose the topological closeness metric to evaluate GNNs' ability to capture the structural distance between nodes, which closely aligns with the recommendation objective. To validate the effectiveness of this new metric in evaluating recommendation performance, we introduce a learning-less GNN algorithm that is optimal on the new metric and can be optimal on the node-level metric with suitable modification. We conduct extensive experiments comparing the proposed algorithm against various types of state-of-the-art GNN models to explore the effectiveness of the new metric in the recommendation task. For the sake of reproducibility, implementation codes are available at https://github.com/HKUDS/GTE. |
| Persistent Identifier | http://hdl.handle.net/10722/355959 |
| ISI Accession Number ID | WOS:001161549500020 |
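
The abstract above describes a learning-less GNN algorithm that scores items by their topological closeness to a target user. The snippet below is a minimal illustrative sketch of that general idea, assuming a LightGCN-style symmetrically normalized bipartite adjacency; it is not the authors' GTE implementation (available at https://github.com/HKUDS/GTE), and the function name `learning_less_scores`, the hop count, and the score accumulation scheme are all hypothetical choices for illustration only.

```python
# Hypothetical sketch, NOT the authors' GTE code (see https://github.com/HKUDS/GTE):
# a parameter-free propagation that scores items for a target user by how much
# signal reaches them through the user-item graph, as a rough proxy for the
# "topological closeness" idea described in the abstract.
import numpy as np

def learning_less_scores(interactions: np.ndarray, user: int, hops: int = 3) -> np.ndarray:
    """interactions: binary user-item matrix of shape (n_users, n_items)."""
    n_users, n_items = interactions.shape
    # Symmetrically normalized bipartite adjacency (LightGCN-style normalization).
    deg_u = interactions.sum(axis=1, keepdims=True).clip(min=1)
    deg_i = interactions.sum(axis=0, keepdims=True).clip(min=1)
    norm_adj = interactions / np.sqrt(deg_u) / np.sqrt(deg_i)

    # Start from a one-hot indicator on the target user and alternate
    # user->item and item->user propagation steps; no parameters are learned.
    user_signal = np.zeros(n_users)
    user_signal[user] = 1.0
    item_scores = np.zeros(n_items)
    for _ in range(hops):
        item_signal = norm_adj.T @ user_signal   # push signal to items
        item_scores += item_signal               # accumulate multi-hop evidence
        user_signal = norm_adj @ item_signal     # push back to users for the next hop

    # Mask items the user has already interacted with before ranking.
    item_scores[interactions[user] > 0] = -np.inf
    return item_scores

# Usage: rank unseen items for user 0 in a toy interaction matrix.
R = np.array([[1, 0, 1, 0],
              [1, 1, 0, 0],
              [0, 1, 1, 1]], dtype=float)
print(np.argsort(-learning_less_scores(R, user=0)))
```

Accumulating the per-hop item signals, rather than keeping only the final hop, is one simple way to weight structurally nearer items more heavily; the paper's actual definition of topological closeness and the GTE algorithm should be taken from the publication and repository cited in this record.
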
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Cai, Xuheng | - |
| dc.contributor.author | Xia, Lianghao | - |
| dc.contributor.author | Ren, Xubin | - |
| dc.contributor.author | Huang, Chao | - |
| dc.date.accessioned | 2025-05-19T05:46:54Z | - |
| dc.date.available | 2025-05-19T05:46:54Z | - |
| dc.date.issued | 2023 | - |
| dc.identifier.citation | International Conference on Information and Knowledge Management, Proceedings, 2023, p. 173-182 | - |
| dc.identifier.uri | http://hdl.handle.net/10722/355959 | - |
| dc.description.abstract | Graph Neural Networks (GNNs) have demonstrated superior performance in various graph learning tasks, including recommendation, where they explore user-item collaborative filtering signals within graphs. However, despite their empirical effectiveness in state-of-the-art recommender models, theoretical formulations of their capability are scarce. Recently, researchers have explored the expressiveness of GNNs, demonstrating that message passing GNNs are at most as powerful as the Weisfeiler-Lehman test, and that GNNs combined with random node initialization are universal. Nevertheless, the concept of "expressiveness" for GNNs remains vaguely defined. Most existing works adopt the graph isomorphism test as the metric of expressiveness, but this graph-level task may not effectively assess a model's ability in recommendation, where the objective is to distinguish nodes of different closeness. In this paper, we provide a comprehensive theoretical analysis of the expressiveness of GNNs in recommendation, considering three levels of expressiveness metrics: graph isomorphism (graph-level), node automorphism (node-level), and topological closeness (link-level). We propose the topological closeness metric to evaluate GNNs' ability to capture the structural distance between nodes, which closely aligns with the recommendation objective. To validate the effectiveness of this new metric in evaluating recommendation performance, we introduce a learning-less GNN algorithm that is optimal on the new metric and can be optimal on the node-level metric with suitable modification. We conduct extensive experiments comparing the proposed algorithm against various types of state-of-the-art GNN models to explore the effectiveness of the new metric in the recommendation task. For the sake of reproducibility, implementation codes are available at https://github.com/HKUDS/GTE. | - |
| dc.language | eng | - |
| dc.relation.ispartof | International Conference on Information and Knowledge Management, Proceedings | - |
| dc.subject | Graph Neural Networks | - |
| dc.subject | Recommender Systems | - |
| dc.title | How Expressive are Graph Neural Networks in Recommendation? | - |
| dc.type | Conference_Paper | - |
| dc.description.nature | link_to_subscribed_fulltext | - |
| dc.identifier.doi | 10.1145/3583780.3614917 | - |
| dc.identifier.scopus | eid_2-s2.0-85178121542 | - |
| dc.identifier.spage | 173 | - |
| dc.identifier.epage | 182 | - |
| dc.identifier.isi | WOS:001161549500020 | - |
