Article: One Transistor One Electrolyte‐Gated Transistor Based Spiking Neural Network for Power‐Efficient Neuromorphic Computing System

Title: One Transistor One Electrolyte‐Gated Transistor Based Spiking Neural Network for Power‐Efficient Neuromorphic Computing System
Authors: Li, Y; Xuan, Z; Lu, J; Wang, Z; Zhang, X; Wu, Z; Wang, Y; Xu, H; Dou, C; Kang, Y; Liu, Q; Lv, H; Shang, D
Keywords: associative memory; electrolyte-gated transistors; ion intercalation; neuromorphic computing; spiking neural networks
Issue Date: 2021
Publisher: Wiley-VCH Verlag GmbH & Co KGaA. The journal's web site is located at http://www.wiley-vch.de/home/afm
Citation: Advanced Functional Materials, 2021, v. 31 n. 26, p. article no. 2100042
Abstract: Neuromorphic computing powered by spiking neural networks (SNNs) provides a powerful and efficient information-processing paradigm. To harness the advantages of SNNs, compact and low-power synapses that can reliably implement local learning rules are required, posing significant challenges to the conventional silicon-based platform in terms of area and energy efficiency as well as computing throughput. Here, electrolyte-gated transistors (EGTs) paired with conventional transistors are employed to implement power-efficient neuromorphic computing systems. The one-transistor-one-EGT (1T1E) synapse not only alleviates the self-discharging of the EGT but also provides a flexible and efficient way to implement the important spike-timing-dependent plasticity (STDP) learning rule. Based on this, an SNN with a temporal coding scheme is implemented for associative memory that can learn and recover images of handwritten digits with high robustness. Thanks to the temporal coding scheme and the low operation current of the EGTs, the energy consumption of the 1T1E-based SNN is ≈30× lower than that of the prevalent rate coding scheme, and the peak performance is estimated to be 2 pJ per synaptic operation (SOP) in the training phase and 80 TOPS W−1 (tera operations per second per watt) in the inference phase. These results pave the way for power-efficient neuromorphic computing systems with wide applications in edge computing.
Persistent Identifier: http://hdl.handle.net/10722/306223
ISSN: 1616-301X
2021 Impact Factor: 19.924
2020 SCImago Journal Rankings: 6.069
ISI Accession Number: WOS:000640921600001
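
The abstract above refers to a spike-timing-dependent plasticity (STDP) learning rule applied locally at each 1T1E synapse. For orientation only, the following is a minimal sketch of a generic pair-based STDP weight update in Python; it is not the article's circuit-level implementation, and the function name stdp_delta_w and the parameter values (a_plus, a_minus, tau_plus, tau_minus) are illustrative assumptions rather than values from the paper.

# Illustrative sketch only: a generic pair-based STDP update rule.
# Not the article's 1T1E hardware implementation; parameters are hypothetical.
import math

def stdp_delta_w(t_pre, t_post, a_plus=0.01, a_minus=0.012,
                 tau_plus=20.0, tau_minus=20.0):
    """Return the weight change for one pre/post spike pair (times in ms).

    Potentiate when the presynaptic spike precedes the postsynaptic one
    (dt > 0); depress otherwise. Both branches decay exponentially with |dt|.
    """
    dt = t_post - t_pre
    if dt > 0:
        return a_plus * math.exp(-dt / tau_plus)    # pre before post: potentiation
    if dt < 0:
        return -a_minus * math.exp(dt / tau_minus)  # post before pre: depression
    return 0.0

if __name__ == "__main__":
    # Example spike pair: presynaptic at 10 ms, postsynaptic at 15 ms.
    print(f"dw(+5 ms) = {stdp_delta_w(10.0, 15.0):+.4f}")  # small positive update
    print(f"dw(-5 ms) = {stdp_delta_w(15.0, 10.0):+.4f}")  # small negative update

Under a temporal coding scheme, information is carried in spike timing rather than spike counts, so far fewer spikes (and hence fewer such updates) are needed than under rate coding, which is consistent with the ≈30× energy reduction reported in the abstract.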

 

DC Field: Value
dc.contributor.author: Li, Y
dc.contributor.author: Xuan, Z
dc.contributor.author: Lu, J
dc.contributor.author: Wang, Z
dc.contributor.author: Zhang, X
dc.contributor.author: Wu, Z
dc.contributor.author: Wang, Y
dc.contributor.author: Xu, H
dc.contributor.author: Dou, C
dc.contributor.author: Kang, Y
dc.contributor.author: Liu, Q
dc.contributor.author: Lv, H
dc.contributor.author: Shang, D
dc.date.accessioned: 2021-10-20T10:20:33Z
dc.date.available: 2021-10-20T10:20:33Z
dc.date.issued: 2021
dc.identifier.citation: Advanced Functional Materials, 2021, v. 31 n. 26, p. article no. 2100042
dc.identifier.issn: 1616-301X
dc.identifier.uri: http://hdl.handle.net/10722/306223
dc.description.abstract: Neuromorphic computing powered by spiking neural networks (SNNs) provides a powerful and efficient information-processing paradigm. To harness the advantages of SNNs, compact and low-power synapses that can reliably implement local learning rules are required, posing significant challenges to the conventional silicon-based platform in terms of area and energy efficiency as well as computing throughput. Here, electrolyte-gated transistors (EGTs) paired with conventional transistors are employed to implement power-efficient neuromorphic computing systems. The one-transistor-one-EGT (1T1E) synapse not only alleviates the self-discharging of the EGT but also provides a flexible and efficient way to implement the important spike-timing-dependent plasticity (STDP) learning rule. Based on this, an SNN with a temporal coding scheme is implemented for associative memory that can learn and recover images of handwritten digits with high robustness. Thanks to the temporal coding scheme and the low operation current of the EGTs, the energy consumption of the 1T1E-based SNN is ≈30× lower than that of the prevalent rate coding scheme, and the peak performance is estimated to be 2 pJ per synaptic operation (SOP) in the training phase and 80 TOPS W−1 (tera operations per second per watt) in the inference phase. These results pave the way for power-efficient neuromorphic computing systems with wide applications in edge computing.
dc.language: eng
dc.publisher: Wiley-VCH Verlag GmbH & Co KGaA. The Journal's web site is located at http://www.wiley-vch.de/home/afm
dc.relation.ispartof: Advanced Functional Materials
dc.rights: Submitted (preprint) Version: This is the pre-peer reviewed version of the following article: [FULL CITE], which has been published in final form at [Link to final article using the DOI]. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for Use of Self-Archived Versions. Accepted (peer-reviewed) Version: This is the peer reviewed version of the following article: [FULL CITE], which has been published in final form at [Link to final article using the DOI]. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for Use of Self-Archived Versions.
dc.subject: associative memory
dc.subject: electrolyte-gated transistors
dc.subject: ion intercalation
dc.subject: neuromorphic computing
dc.subject: spiking neural networks
dc.title: One Transistor One Electrolyte‐Gated Transistor Based Spiking Neural Network for Power‐Efficient Neuromorphic Computing System
dc.type: Article
dc.identifier.email: Wang, Z: zrwang@eee.hku.hk
dc.identifier.authority: Wang, Z=rp02714
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1002/adfm.202100042
dc.identifier.scopus: eid_2-s2.0-85104332960
dc.identifier.hkuros: 327772
dc.identifier.volume: 31
dc.identifier.issue: 26
dc.identifier.spage: article no. 2100042
dc.identifier.epage: article no. 2100042
dc.identifier.isi: WOS:000640921600001
dc.publisher.place: Germany
