Article: Echo state graph neural networks with analogue random resistive memory arrays

Title: Echo state graph neural networks with analogue random resistive memory arrays
Authors: Wang, SC; Li, Y; Wang, DC; Zhang, WY; Chen, X; Dong, DN; Wang, SQ; Zhang, XM; Lin, P; Gallicchio, C; Xu, XX; Liu, Q; Cheng, KT; Wang, ZR; Shang, DS; Liu, M
Issue Date: 13-Feb-2023
Publisher: Springer Nature
Citation: Nature Machine Intelligence, 2023, v. 5, n. 2, p. 104-113
Abstract

Recent years have witnessed a surge of interest in learning representations of graph-structured data, with applications from social networks to drug discovery. However, graph neural networks, the machine learning models for handling graph-structured data, face significant challenges when running on conventional digital hardware, including the slowdown of Moore’s law due to transistor scaling limits and the von Neumann bottleneck incurred by physically separated memory and processing units, as well as a high training cost. Here we present a hardware–software co-design to address these challenges, by designing an echo state graph neural network based on random resistive memory arrays, which are built from low-cost, nanoscale and stackable resistors for efficient in-memory computing. This approach leverages the intrinsic stochasticity of dielectric breakdown in resistive switching to implement random projections in hardware for an echo state network that effectively minimizes the training complexity thanks to its fixed and random weights. The system demonstrates state-of-the-art performance on both graph classification using the MUTAG and COLLAB datasets and node classification using the CORA dataset, achieving 2.16×, 35.42× and 40.37× improvements in energy efficiency for a projected random resistive memory-based hybrid analogue–digital system over a state-of-the-art graphics processing unit and 99.35%, 99.99% and 91.40% reductions of backward pass complexity compared with conventional graph learning. The results point to a promising direction for next-generation artificial intelligence systems for graph learning.
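The core idea the abstract describes, fixed random projection weights (realized in hardware by the stochastic resistive arrays) combined with a trainable readout only, can be illustrated in software. The sketch below is a minimal, hypothetical echo state graph network in NumPy: node features are recursively mixed over the graph through fixed random input and recurrent matrices, and only a closed-form ridge-regression readout is trained. All function names, sizes and hyperparameters here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def echo_state_embedding(adj, feats, hidden=32, steps=4, leak=0.7):
    """Mix node features over the graph through a fixed random reservoir.

    The random matrices W_in and W_res are never trained, mirroring the
    role of the random resistive arrays; only the readout learns.
    """
    n, d = feats.shape
    W_in = rng.normal(size=(d, hidden))        # fixed random input projection
    W_res = rng.normal(size=(hidden, hidden))  # fixed random recurrent weights
    # Rescale below unit spectral radius so the echo state property holds
    W_res *= 0.9 / np.abs(np.linalg.eigvals(W_res)).max()
    state = np.zeros((n, hidden))
    deg = adj.sum(1, keepdims=True).clip(min=1)
    for _ in range(steps):
        msg = (adj @ state) / deg              # mean-aggregate neighbour states
        state = (1 - leak) * state + leak * np.tanh(feats @ W_in + msg @ W_res)
    return state

def train_readout(states, labels, ridge=1e-2):
    """Closed-form ridge regression: the only trained weights in the model."""
    Y = np.eye(labels.max() + 1)[labels]       # one-hot targets
    A = states.T @ states + ridge * np.eye(states.shape[1])
    return np.linalg.solve(A, states.T @ Y)

# Toy usage: a 4-node path graph with two node classes
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
feats = rng.normal(size=(4, 3))
emb = echo_state_embedding(adj, feats)
W_out = train_readout(emb, np.array([0, 0, 1, 1]))
pred = (emb @ W_out).argmax(1)
```

Because the backward pass only ever touches the small readout matrix, training cost is decoupled from the reservoir size, which is the source of the complexity reductions the abstract reports.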


Persistent Identifier: http://hdl.handle.net/10722/340487
ISI Accession Number ID: WOS:000930411100001

 

DC Field: Value
dc.contributor.author: Wang, SC
dc.contributor.author: Li, Y
dc.contributor.author: Wang, DC
dc.contributor.author: Zhang, WY
dc.contributor.author: Chen, X
dc.contributor.author: Dong, DN
dc.contributor.author: Wang, SQ
dc.contributor.author: Zhang, XM
dc.contributor.author: Lin, P
dc.contributor.author: Gallicchio, C
dc.contributor.author: Xu, XX
dc.contributor.author: Liu, Q
dc.contributor.author: Cheng, KT
dc.contributor.author: Wang, ZR
dc.contributor.author: Shang, DS
dc.contributor.author: Liu, M
dc.date.accessioned: 2024-03-11T10:45:00Z
dc.date.available: 2024-03-11T10:45:00Z
dc.date.issued: 2023-02-13
dc.identifier.citation: Nature Machine Intelligence, 2023, v. 5, n. 2, p. 104-113
dc.identifier.uri: http://hdl.handle.net/10722/340487
dc.description.abstract: Recent years have witnessed a surge of interest in learning representations of graph-structured data, with applications from social networks to drug discovery. However, graph neural networks, the machine learning models for handling graph-structured data, face significant challenges when running on conventional digital hardware, including the slowdown of Moore’s law due to transistor scaling limits and the von Neumann bottleneck incurred by physically separated memory and processing units, as well as a high training cost. Here we present a hardware–software co-design to address these challenges, by designing an echo state graph neural network based on random resistive memory arrays, which are built from low-cost, nanoscale and stackable resistors for efficient in-memory computing. This approach leverages the intrinsic stochasticity of dielectric breakdown in resistive switching to implement random projections in hardware for an echo state network that effectively minimizes the training complexity thanks to its fixed and random weights. The system demonstrates state-of-the-art performance on both graph classification using the MUTAG and COLLAB datasets and node classification using the CORA dataset, achieving 2.16×, 35.42× and 40.37× improvements in energy efficiency for a projected random resistive memory-based hybrid analogue–digital system over a state-of-the-art graphics processing unit and 99.35%, 99.99% and 91.40% reductions of backward pass complexity compared with conventional graph learning. The results point to a promising direction for next-generation artificial intelligence systems for graph learning.
dc.language: eng
dc.publisher: Springer Nature
dc.relation.ispartof: Nature Machine Intelligence
dc.rights: This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
dc.title: Echo state graph neural networks with analogue random resistive memory arrays
dc.type: Article
dc.identifier.doi: 10.1038/s42256-023-00609-5
dc.identifier.scopus: eid_2-s2.0-85147953508
dc.identifier.volume: 5
dc.identifier.issue: 2
dc.identifier.spage: 104
dc.identifier.epage: 113
dc.identifier.eissn: 2522-5839
dc.identifier.isi: WOS:000930411100001
dc.identifier.issnl: 2522-5839
