Conference Paper: Augmenting Message Passing by Retrieving Similar Graphs
Field | Value
---|---
Title | Augmenting Message Passing by Retrieving Similar Graphs
Authors | Wang, Dingmin; Liu, Shengchao; Wang, Hanchen; Song, Linfeng; Tang, Jian; Le, Song; Grau, Bernardo Cuenca; Liu, Qi
Issue Date | 1-Jul-2023
Abstract | Graph Neural Networks (GNNs) are effective tools for graph representation learning. Most GNNs rely on a recursive neighborhood aggregation scheme, known as message passing, and thus their theoretical expressive power is limited to the first-order Weisfeiler-Lehman test (1-WL). Motivated by the success of retrieval-based models and off-the-shelf high-performance retrieval systems, we propose a non-parametric and model-agnostic scheme called GRAPHRETRIEVAL to boost existing GNN models. In GRAPHRETRIEVAL, similar training graphs, together with their ground-truth labels, are retrieved as an enhancement to be used jointly with the input graph representation for various graph property prediction tasks. In particular, to effectively “absorb” useful information from retrieved graphs and “ignore” possible noise introduced by potentially irrelevant graphs, we introduce an adapter based on self-attention to explicitly learn the interaction between an input graph and its retrieved similar graphs. By experimenting with three classic GNN models on 12 different datasets, we demonstrate that GRAPHRETRIEVAL brings substantial improvements to existing GNN models without compromising model size or prediction efficiency. Our work is also the first to validate the feasibility and effectiveness of retrieval-enhanced graph neural networks.
Persistent Identifier | http://hdl.handle.net/10722/340253
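
The abstract describes a self-attention adapter that lets an input graph's representation attend to the embeddings and ground-truth labels of retrieved similar training graphs. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' implementation: the class name `RetrievalAdapter`, the dimensions, and the way labels are fused with retrieved embeddings are all assumptions.

```python
# Hypothetical sketch of a retrieval-augmented prediction step in the spirit of
# GRAPHRETRIEVAL; the fusion and readout choices here are assumptions, not the paper's.
import torch
import torch.nn as nn

class RetrievalAdapter(nn.Module):
    """Self-attention adapter: an input-graph embedding attends to the embeddings
    (and ground-truth labels) of k retrieved similar training graphs."""

    def __init__(self, embed_dim: int, label_dim: int, num_heads: int = 4, num_classes: int = 2):
        super().__init__()
        # Project label vectors into the embedding space so they can be added
        # to the retrieved-graph embeddings (one possible fusion choice).
        self.label_proj = nn.Linear(label_dim, embed_dim)
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.readout = nn.Linear(embed_dim, num_classes)

    def forward(self, query_emb, retrieved_embs, retrieved_labels):
        # query_emb:        (batch, embed_dim)      embedding of the input graph
        # retrieved_embs:   (batch, k, embed_dim)   embeddings of k retrieved graphs
        # retrieved_labels: (batch, k, label_dim)   their ground-truth labels
        retrieved = retrieved_embs + self.label_proj(retrieved_labels)
        # Sequence = [input graph] + [k retrieved graphs]; self-attention can learn
        # which retrieved graphs to "absorb" and which to "ignore".
        tokens = torch.cat([query_emb.unsqueeze(1), retrieved], dim=1)
        attended, _ = self.attn(tokens, tokens, tokens)
        # Predict from the position of the input graph.
        return self.readout(attended[:, 0])

# Example with random tensors; in practice a GNN encoder would produce the embeddings
# and an off-the-shelf retrieval index would supply the k similar training graphs.
adapter = RetrievalAdapter(embed_dim=64, label_dim=1, num_heads=4, num_classes=2)
logits = adapter(torch.randn(8, 64), torch.randn(8, 5, 64), torch.randn(8, 5, 1))
print(logits.shape)  # torch.Size([8, 2])
```

Because the adapter only reads precomputed embeddings and labels, it adds no parameters to the underlying GNN encoder, which matches the abstract's claim that the scheme does not increase model size.
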
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Wang, Dingmin | - |
dc.contributor.author | Liu, Shengchao | - |
dc.contributor.author | Wang, Hanchen | - |
dc.contributor.author | Song, Linfeng | - |
dc.contributor.author | Tang, Jian | - |
dc.contributor.author | Le, Song | - |
dc.contributor.author | Grau, Bernardo Cuenca | - |
dc.contributor.author | Liu, Qi | - |
dc.date.accessioned | 2024-03-11T10:42:48Z | - |
dc.date.available | 2024-03-11T10:42:48Z | - |
dc.date.issued | 2023-07-01 | - |
dc.identifier.uri | http://hdl.handle.net/10722/340253 | - |
dc.description.abstract | Graph Neural Networks (GNNs) are effective tools for graph representation learning. Most GNNs rely on a recursive neighborhood aggregation scheme, known as message passing, and thus their theoretical expressive power is limited to the first-order Weisfeiler-Lehman test (1-WL). Motivated by the success of retrieval-based models and off-the-shelf high-performance retrieval systems, we propose a non-parametric and model-agnostic scheme called GRAPHRETRIEVAL to boost existing GNN models. In GRAPHRETRIEVAL, similar training graphs, together with their ground-truth labels, are retrieved as an enhancement to be used jointly with the input graph representation for various graph property prediction tasks. In particular, to effectively “absorb” useful information from retrieved graphs and “ignore” possible noise introduced by potentially irrelevant graphs, we introduce an adapter based on self-attention to explicitly learn the interaction between an input graph and its retrieved similar graphs. By experimenting with three classic GNN models on 12 different datasets, we demonstrate that GRAPHRETRIEVAL brings substantial improvements to existing GNN models without compromising model size or prediction efficiency. Our work is also the first to validate the feasibility and effectiveness of retrieval-enhanced graph neural networks. | -
dc.language | eng | - |
dc.relation.ispartof | European Conference on Artificial Intelligence | - |
dc.title | Augmenting Message Passing by Retrieving Similar Graphs | - |
dc.type | Conference_Paper | - |