Article: Echo state graph neural networks with analogue random resistive memory arrays
Title | Echo state graph neural networks with analogue random resistive memory arrays |
---|---|
Authors | |
Issue Date | 13-Feb-2023 |
Publisher | Springer Nature |
Citation | Nature Machine Intelligence, 2023, v. 5, n. 2, p. 104-113 |
Abstract | Recent years have witnessed a surge of interest in learning representations of graph-structured data, with applications from social networks to drug discovery. However, graph neural networks, the machine learning models for handling graph-structured data, face significant challenges when running on conventional digital hardware, including the slowdown of Moore’s law due to transistor scaling limits and the von Neumann bottleneck incurred by physically separated memory and processing units, as well as a high training cost. Here we present a hardware–software co-design to address these challenges, by designing an echo state graph neural network based on random resistive memory arrays, which are built from low-cost, nanoscale and stackable resistors for efficient in-memory computing. This approach leverages the intrinsic stochasticity of dielectric breakdown in resistive switching to implement random projections in hardware for an echo state network that effectively minimizes the training complexity thanks to its fixed and random weights. The system demonstrates state-of-the-art performance on both graph classification using the MUTAG and COLLAB datasets and node classification using the CORA dataset, achieving 2.16×, 35.42× and 40.37× improvements in energy efficiency for a projected random resistive memory-based hybrid analogue–digital system over a state-of-the-art graphics processing unit and 99.35%, 99.99% and 91.40% reductions of backward pass complexity compared with conventional graph learning. The results point to a promising direction for next-generation artificial intelligence systems for graph learning. |
Persistent Identifier | http://hdl.handle.net/10722/340487 |
ISI Accession Number ID | WOS:000930411100001 |
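The abstract's central idea is that an echo state network keeps its input and recurrent weights fixed and random (here, set physically by the stochastic dielectric breakdown of resistive memory cells), so only a linear readout is ever trained. A minimal software sketch of that scheme for node classification, using an adjacency matrix to mix neighbour states, is shown below; all shapes, the toy graph, and the ridge-regression readout are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy undirected graph and node features (sizes are illustrative).
n_nodes, n_feat, n_reservoir = 6, 4, 32
A = rng.integers(0, 2, size=(n_nodes, n_nodes)).astype(float)
A = np.triu(A, 1)
A = A + A.T                                   # symmetric, no self-loops
X = rng.standard_normal((n_nodes, n_feat))

# Fixed random weights: in the paper these correspond to conductances set
# once by stochastic dielectric breakdown; they are never updated.
W_in = 0.5 * rng.standard_normal((n_feat, n_reservoir))
W_res = rng.standard_normal((n_reservoir, n_reservoir))
W_res *= 0.9 / max(abs(np.linalg.eigvals(W_res)))   # spectral radius < 1 (echo state property)

# Iterate the reservoir: each step combines node features with neighbour states.
H = np.zeros((n_nodes, n_reservoir))
for _ in range(5):
    H = np.tanh(X @ W_in + A @ H @ W_res)

# Only the linear readout is trained (ridge regression here), which is why
# the backward-pass complexity drops so sharply versus end-to-end training.
y = rng.integers(0, 2, size=n_nodes).astype(float)   # dummy node labels
ridge = 1e-2
W_out = np.linalg.solve(H.T @ H + ridge * np.eye(n_reservoir), H.T @ y)
pred = H @ W_out
print(pred.shape)  # (6,)
```

Because `W_in` and `W_res` stay frozen, training reduces to one linear solve over the reservoir states, which is the source of the reduced training cost the abstract quantifies.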
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Wang, SC | - |
dc.contributor.author | Li, Y | - |
dc.contributor.author | Wang, DC | - |
dc.contributor.author | Zhang, WY | - |
dc.contributor.author | Chen, X | - |
dc.contributor.author | Dong, DN | - |
dc.contributor.author | Wang, SQ | - |
dc.contributor.author | Zhang, XM | - |
dc.contributor.author | Lin, P | - |
dc.contributor.author | Gallicchio, C | - |
dc.contributor.author | Xu, XX | - |
dc.contributor.author | Liu, Q | - |
dc.contributor.author | Cheng, KT | - |
dc.contributor.author | Wang, ZR | - |
dc.contributor.author | Shang, DS | - |
dc.contributor.author | Liu, M | - |
dc.date.accessioned | 2024-03-11T10:45:00Z | - |
dc.date.available | 2024-03-11T10:45:00Z | - |
dc.date.issued | 2023-02-13 | - |
dc.identifier.citation | Nature Machine Intelligence, 2023, v. 5, n. 2, p. 104-113 | - |
dc.identifier.uri | http://hdl.handle.net/10722/340487 | - |
dc.description.abstract | Recent years have witnessed a surge of interest in learning representations of graph-structured data, with applications from social networks to drug discovery. However, graph neural networks, the machine learning models for handling graph-structured data, face significant challenges when running on conventional digital hardware, including the slowdown of Moore’s law due to transistor scaling limits and the von Neumann bottleneck incurred by physically separated memory and processing units, as well as a high training cost. Here we present a hardware–software co-design to address these challenges, by designing an echo state graph neural network based on random resistive memory arrays, which are built from low-cost, nanoscale and stackable resistors for efficient in-memory computing. This approach leverages the intrinsic stochasticity of dielectric breakdown in resistive switching to implement random projections in hardware for an echo state network that effectively minimizes the training complexity thanks to its fixed and random weights. The system demonstrates state-of-the-art performance on both graph classification using the MUTAG and COLLAB datasets and node classification using the CORA dataset, achieving 2.16×, 35.42× and 40.37× improvements in energy efficiency for a projected random resistive memory-based hybrid analogue–digital system over a state-of-the-art graphics processing unit and 99.35%, 99.99% and 91.40% reductions of backward pass complexity compared with conventional graph learning. The results point to a promising direction for next-generation artificial intelligence systems for graph learning. | - |
dc.language | eng | - |
dc.publisher | Springer Nature | - |
dc.relation.ispartof | Nature Machine Intelligence | - |
dc.rights | This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. | - |
dc.title | Echo state graph neural networks with analogue random resistive memory arrays | - |
dc.type | Article | - |
dc.identifier.doi | 10.1038/s42256-023-00609-5 | - |
dc.identifier.scopus | eid_2-s2.0-85147953508 | - |
dc.identifier.volume | 5 | - |
dc.identifier.issue | 2 | - |
dc.identifier.spage | 104 | - |
dc.identifier.epage | 113 | - |
dc.identifier.eissn | 2522-5839 | - |
dc.identifier.isi | WOS:000930411100001 | - |
dc.identifier.issnl | 2522-5839 | - |