
Conference Paper: Augmenting Message Passing by Retrieving Similar Graphs

Title: Augmenting Message Passing by Retrieving Similar Graphs
Authors: Wang, Dingmin; Liu, Shengchao; Wang, Hanchen; Song, Linfeng; Tang, Jian; Le, Song; Grau, Bernardo Cuenca; Liu, Qi
Issue Date: 1-Jul-2023
Abstract

Graph Neural Networks (GNNs) are effective tools for graph representation learning. Most GNNs rely on a recursive neighborhood aggregation scheme, named message passing, and hence their theoretical expressive power is limited to the first-order Weisfeiler-Lehman test (1-WL). Motivated by the success of retrieval-based models and off-the-shelf high-performance retrieval systems, we propose a non-parametric and model-agnostic scheme called GRAPHRETRIEVAL to boost existing GNN models. In GRAPHRETRIEVAL, similar training graphs associated with their ground-truth labels are retrieved as an enhancement to be jointly utilized with the input graph representation to complete various graph property predictive tasks. In particular, to effectively “absorb” useful information from retrieved graphs and “ignore” possible noise introduced by potentially irrelevant graphs, we introduce an adapter based on self-attention to explicitly learn the interaction between an input graph and its retrieved similar graphs. By experimenting with three classic GNN models on 12 different datasets, we have demonstrated that GRAPHRETRIEVAL is able to bring substantial improvements to existing GNN models without compromising the model size or the prediction efficiency. Our work is also the first to validate the feasibility and effectiveness of retrieval-enhanced graph neural networks.
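The retrieve-then-adapt idea described in the abstract can be sketched roughly as follows. This is an illustrative sketch with hypothetical function names, not the paper's implementation: it uses plain cosine-similarity retrieval over precomputed graph embeddings and an unparameterized dot-product self-attention in place of the paper's learned adapter.

```python
import numpy as np

def retrieve_top_k(query, bank, k=3):
    # Retrieve the k most similar training-graph embeddings by cosine similarity.
    # query: (d,) embedding of the input graph; bank: (n, d) training embeddings.
    q = query / np.linalg.norm(query)
    b = bank / np.linalg.norm(bank, axis=1, keepdims=True)
    sims = b @ q
    return np.argsort(-sims)[:k]

def self_attention_adapter(query, retrieved):
    # Treat the input-graph embedding and the retrieved embeddings as tokens,
    # and let self-attention decide how much of each retrieved graph to "absorb".
    tokens = np.vstack([query, retrieved])                     # (k+1, d)
    scores = tokens @ tokens.T / np.sqrt(tokens.shape[1])      # scaled dot-product
    attn = np.exp(scores - scores.max(axis=1, keepdims=True))  # row-wise softmax
    attn /= attn.sum(axis=1, keepdims=True)
    out = attn @ tokens
    return out[0]  # enhanced representation of the input graph
```

In the actual method, the retrieved graphs' ground-truth labels would also be encoded into the retrieved tokens, and the attention weights would be learned, so that irrelevant retrieved graphs can be down-weighted rather than averaged in uniformly.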


Persistent Identifier: http://hdl.handle.net/10722/340253

DC Field | Value | Language
dc.contributor.author | Wang, Dingmin | -
dc.contributor.author | Liu, Shengchao | -
dc.contributor.author | Wang, Hanchen | -
dc.contributor.author | Song, Linfeng | -
dc.contributor.author | Tang, Jian | -
dc.contributor.author | Le, Song | -
dc.contributor.author | Grau, Bernardo Cuenca | -
dc.contributor.author | Liu, Qi | -
dc.date.accessioned | 2024-03-11T10:42:48Z | -
dc.date.available | 2024-03-11T10:42:48Z | -
dc.date.issued | 2023-07-01 | -
dc.identifier.uri | http://hdl.handle.net/10722/340253 | -
dc.description.abstract | Graph Neural Networks (GNNs) are effective tools for graph representation learning. Most GNNs rely on a recursive neighborhood aggregation scheme, named message passing, and hence their theoretical expressive power is limited to the first-order Weisfeiler-Lehman test (1-WL). Motivated by the success of retrieval-based models and off-the-shelf high-performance retrieval systems, we propose a non-parametric and model-agnostic scheme called GRAPHRETRIEVAL to boost existing GNN models. In GRAPHRETRIEVAL, similar training graphs associated with their ground-truth labels are retrieved as an enhancement to be jointly utilized with the input graph representation to complete various graph property predictive tasks. In particular, to effectively “absorb” useful information from retrieved graphs and “ignore” possible noise introduced by potentially irrelevant graphs, we introduce an adapter based on self-attention to explicitly learn the interaction between an input graph and its retrieved similar graphs. By experimenting with three classic GNN models on 12 different datasets, we have demonstrated that GRAPHRETRIEVAL is able to bring substantial improvements to existing GNN models without compromising the model size or the prediction efficiency. Our work is also the first to validate the feasibility and effectiveness of retrieval-enhanced graph neural networks. | -
dc.language | eng | -
dc.relation.ispartof | European Conference on Artificial Intelligence | -
dc.title | Augmenting Message Passing by Retrieving Similar Graphs | -
dc.type | Conference_Paper | -
