Article: Long short-term memory networks in memristor crossbar arrays

Title: Long short-term memory networks in memristor crossbar arrays
Authors: Li, Can; Wang, Zhongrui; Rao, Mingyi; Belkin, Daniel; Song, Wenhao; Jiang, Hao; Yan, Peng; Li, Yunning; Lin, Peng; Hu, Miao; Ge, Ning; Strachan, John Paul; Barnell, Mark; Wu, Qing; Williams, R. Stanley; Yang, J. Joshua; Xia, Qiangfei
Issue Date: 2019
Citation: Nature Machine Intelligence, 2019, v. 1, n. 1, p. 49-57
Abstract: © 2019, The Author(s), under exclusive licence to Springer Nature Limited. Recent breakthroughs in recurrent deep neural networks with long short-term memory (LSTM) units have led to major advances in artificial intelligence. However, state-of-the-art LSTM models with significantly increased complexity and a large number of parameters have a bottleneck in computing power resulting from both limited memory capacity and limited data communication bandwidth. Here we demonstrate experimentally that the synaptic weights shared in different time steps in an LSTM can be implemented with a memristor crossbar array, which has a small circuit footprint, can store a large number of parameters and offers in-memory computing capability that contributes to circumventing the ‘von Neumann bottleneck’. We illustrate the capability of our crossbar system as a core component in solving real-world problems in regression and classification, which shows that memristor LSTM is a promising low-power and low-latency hardware platform for edge inference.
Persistent Identifier: http://hdl.handle.net/10722/286812
ISI Accession Number ID: WOS:000566947600012
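The abstract describes mapping the LSTM's time-step-shared weight matrix onto a memristor crossbar, where applying input voltages to the rows yields column currents equal to a vector-matrix product in a single analogue step. The following is a minimal illustrative sketch (not the authors' code): it models an ideal crossbar as a conductance matrix and reuses the same conductances at every LSTM time step. All names and sizes are hypothetical.

```python
import numpy as np

def crossbar_vmm(G, v):
    """Ideal crossbar: row voltages v in, column currents out.
    By Ohm's and Kirchhoff's laws the currents are i = G^T v,
    i.e. a vector-matrix multiply computed in memory."""
    return G.T @ v

def lstm_step(x, h, c, G):
    """One LSTM time step. All four gate weight blocks are stored
    side by side in one crossbar; the same conductances G are
    reused at every time step (the weight sharing the paper exploits)."""
    z = crossbar_vmm(G, np.concatenate([x, h, [1.0]]))  # last row acts as bias
    i, f, g, o = np.split(z, 4)
    sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
    c = f * c + i * np.tanh(g)       # cell-state update
    h = o * np.tanh(c)               # hidden-state output
    return h, c

rng = np.random.default_rng(0)
n_in, n_h = 3, 4
# Conductance matrix: (inputs + hidden + bias) rows, 4*n_h gate columns.
G = rng.uniform(0.0, 1e-3, size=(n_in + n_h + 1, 4 * n_h))
h, c = np.zeros(n_h), np.zeros(n_h)
for t in range(5):                   # weights shared across all time steps
    h, c = lstm_step(rng.normal(size=n_in), h, c, G)
```

In real hardware, signed weights are typically encoded as differential pairs of conductances and device noise and nonidealities must be handled; this ideal model only illustrates the in-memory vector-matrix data flow.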


DC Field / Value

dc.contributor.author: Li, Can
dc.contributor.author: Wang, Zhongrui
dc.contributor.author: Rao, Mingyi
dc.contributor.author: Belkin, Daniel
dc.contributor.author: Song, Wenhao
dc.contributor.author: Jiang, Hao
dc.contributor.author: Yan, Peng
dc.contributor.author: Li, Yunning
dc.contributor.author: Lin, Peng
dc.contributor.author: Hu, Miao
dc.contributor.author: Ge, Ning
dc.contributor.author: Strachan, John Paul
dc.contributor.author: Barnell, Mark
dc.contributor.author: Wu, Qing
dc.contributor.author: Williams, R. Stanley
dc.contributor.author: Yang, J. Joshua
dc.contributor.author: Xia, Qiangfei
dc.date.accessioned: 2020-09-07T11:45:44Z
dc.date.available: 2020-09-07T11:45:44Z
dc.date.issued: 2019
dc.identifier.citation: Nature Machine Intelligence, 2019, v. 1, n. 1, p. 49-57
dc.identifier.uri: http://hdl.handle.net/10722/286812
dc.description.abstract: © 2019, The Author(s), under exclusive licence to Springer Nature Limited. Recent breakthroughs in recurrent deep neural networks with long short-term memory (LSTM) units have led to major advances in artificial intelligence. However, state-of-the-art LSTM models with significantly increased complexity and a large number of parameters have a bottleneck in computing power resulting from both limited memory capacity and limited data communication bandwidth. Here we demonstrate experimentally that the synaptic weights shared in different time steps in an LSTM can be implemented with a memristor crossbar array, which has a small circuit footprint, can store a large number of parameters and offers in-memory computing capability that contributes to circumventing the ‘von Neumann bottleneck’. We illustrate the capability of our crossbar system as a core component in solving real-world problems in regression and classification, which shows that memristor LSTM is a promising low-power and low-latency hardware platform for edge inference.
dc.language: eng
dc.relation.ispartof: Nature Machine Intelligence
dc.title: Long short-term memory networks in memristor crossbar arrays
dc.type: Article
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1038/s42256-018-0001-4
dc.identifier.scopus: eid_2-s2.0-85088419164
dc.identifier.volume: 1
dc.identifier.issue: 1
dc.identifier.spage: 49
dc.identifier.epage: 57
dc.identifier.eissn: 2522-5839
dc.identifier.isi: WOS:000566947600012
dc.identifier.issnl: 2522-5839
