Conference Paper: EEGNN: Edge Enhanced Graph Neural Network with a Bayesian Nonparametric Graph Model
Title | EEGNN: Edge Enhanced Graph Neural Network with a Bayesian Nonparametric Graph Model |
---|---|
Authors | Liu, Yirui; Qiao, Xinghao; Wang, Liying; Lam, Jessica |
Issue Date | 2023 |
Citation | Proceedings of Machine Learning Research, 2023, v. 206, p. 2132-2146 |
Abstract | Training deep graph neural networks (GNNs) is a challenging task, as the performance of GNNs may deteriorate as the number of hidden message-passing layers grows. The literature has focused on over-smoothing and under-reaching to explain the performance deterioration of deep GNNs. In this paper, we propose a new explanation for this phenomenon, mis-simplification: mistakenly simplifying graphs by preventing self-loops and forcing edges to be unweighted. We show that such simplification can reduce the potential of message-passing layers to capture the structural information of graphs. In view of this, we propose a new framework, the edge enhanced graph neural network (EEGNN). EEGNN uses the structural information extracted from the proposed Dirichlet mixture Poisson graph model (DMPGM), a Bayesian nonparametric model for graphs, to improve the performance of various deep message-passing GNNs. We also propose a Markov chain Monte Carlo inference framework for DMPGM. Experiments on different datasets show that our method achieves a considerable performance increase over baselines. |
Persistent Identifier | http://hdl.handle.net/10722/336388 |
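To make the mis-simplification point in the abstract concrete, here is a minimal sketch (not the authors' code; the toy multigraph and function names are illustrative assumptions) of one mean-aggregation message-passing step on a weighted multigraph with self-loops, compared with the same step after the graph has been "simplified" to an unweighted graph without self-loops:

```python
# Minimal sketch of the "mis-simplification" idea: simplifying a multigraph --
# dropping self-loops and collapsing parallel edges to unweighted ones --
# discards structural signal that a message-passing layer could otherwise use.
import numpy as np

def message_passing(A, X):
    """One mean-aggregation message-passing step: row-normalise A, then A @ X."""
    deg = A.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0                      # guard against isolated nodes
    return (A / deg) @ X

# Toy multigraph on 3 nodes: entry (i, j) counts edges between i and j,
# and the diagonal counts self-loops.
A_multi = np.array([[2., 3., 0.],
                    [3., 0., 1.],
                    [0., 1., 1.]])

# "Mis-simplified" version: self-loops removed, edges forced to be unweighted.
A_simple = (A_multi > 0).astype(float)
np.fill_diagonal(A_simple, 0.0)

X = np.eye(3)                                # one-hot node features
print(message_passing(A_multi, X))           # aggregation reflects edge multiplicities
print(message_passing(A_simple, X))          # multiplicity and self-loop info is lost
```

On the multigraph, node 0 weights messages from node 1 three times as heavily as its own self-loop contribution; after simplification, that distinction disappears entirely.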
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Liu, Yirui | - |
dc.contributor.author | Qiao, Xinghao | - |
dc.contributor.author | Wang, Liying | - |
dc.contributor.author | Lam, Jessica | - |
dc.date.accessioned | 2024-01-15T08:26:25Z | - |
dc.date.available | 2024-01-15T08:26:25Z | - |
dc.date.issued | 2023 | - |
dc.identifier.citation | Proceedings of Machine Learning Research, 2023, v. 206, p. 2132-2146 | - |
dc.identifier.uri | http://hdl.handle.net/10722/336388 | - |
dc.description.abstract | Training deep graph neural networks (GNNs) is a challenging task, as the performance of GNNs may deteriorate as the number of hidden message-passing layers grows. The literature has focused on over-smoothing and under-reaching to explain the performance deterioration of deep GNNs. In this paper, we propose a new explanation for this phenomenon, mis-simplification: mistakenly simplifying graphs by preventing self-loops and forcing edges to be unweighted. We show that such simplification can reduce the potential of message-passing layers to capture the structural information of graphs. In view of this, we propose a new framework, the edge enhanced graph neural network (EEGNN). EEGNN uses the structural information extracted from the proposed Dirichlet mixture Poisson graph model (DMPGM), a Bayesian nonparametric model for graphs, to improve the performance of various deep message-passing GNNs. We also propose a Markov chain Monte Carlo inference framework for DMPGM. Experiments on different datasets show that our method achieves a considerable performance increase over baselines. | - |
dc.language | eng | - |
dc.relation.ispartof | Proceedings of Machine Learning Research | - |
dc.title | EEGNN: Edge Enhanced Graph Neural Network with a Bayesian Nonparametric Graph Model | - |
dc.type | Conference_Paper | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.scopus | eid_2-s2.0-85165165285 | - |
dc.identifier.volume | 206 | - |
dc.identifier.spage | 2132 | - |
dc.identifier.epage | 2146 | - |
dc.identifier.eissn | 2640-3498 | - |
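The abstract names DMPGM but does not spell out the model, so the following generative sketch is purely hypothetical: the Dirichlet mixture over latent components, the Gamma-distributed node sociabilities, and the component affinity matrix are all assumptions chosen to illustrate how a Bayesian nonparametric graph model of this family can produce the self-loops and integer edge weights that EEGNN exploits, not the paper's actual specification:

```python
# Hypothetical generative sketch of a Dirichlet-mixture Poisson graph model.
# Every modelling choice here is an assumption for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n_nodes, n_components = 20, 3

# Dirichlet mixture weights over latent components, and component assignments.
pi = rng.dirichlet(alpha=np.ones(n_components))
z = rng.choice(n_components, size=n_nodes, p=pi)

# Per-node sociability parameters (assumed Gamma-distributed).
w = rng.gamma(shape=1.0, scale=1.0, size=n_nodes)

# Component-level affinity (assumed): higher edge rate within a component.
B = np.full((n_components, n_components), 0.1) + 0.9 * np.eye(n_components)

# Poisson edge counts, including self-loops (i == j), yielding a weighted
# multigraph rather than a simple unweighted graph.
rate = np.outer(w, w) * B[np.ix_(z, z)]
counts = rng.poisson(rate)
A = np.triu(counts)                  # keep one draw per unordered node pair
A = A + np.triu(A, 1).T              # symmetrise the off-diagonal counts
print(A)
```

A model of this shape would be fitted with the paper's Markov chain Monte Carlo framework, and the inferred edge rates (rather than the raw simplified adjacency) would then feed the message-passing layers.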