Links for fulltext (may require subscription):
- Publisher Website: 10.1109/ASICON52560.2021.9620292
- Scopus: eid_2-s2.0-85122852581
Citations:
- Scopus: 0

Appears in Collections:
Conference Paper: BATMANN: A Binarized-All-Through Memory-Augmented Neural Network for Efficient In-Memory Computing
Field | Value |
---|---|
Title | BATMANN: A Binarized-All-Through Memory-Augmented Neural Network for Efficient In-Memory Computing |
Authors | REN, Y; LIN, R; RAN, J; LIU, C; TAO, C; Wang, Z; Li, C; Wong, N |
Keywords | RRAM; memory augmented; binary; neural networks; in-memory computing |
Issue Date | 2021 |
Publisher | IEEE. The Journal's web site is located at https://ieeexplore.ieee.org/xpl/conhome/1000054/all-proceedings |
Citation | 2021 IEEE 14th International Conference on ASIC (ASICON 2021), Kunming, China, 26-29 October 2021, p. 1-4 |
Abstract | The traditional von Neumann architecture suffers from heavy data traffic between processing and memory units, which incurs high power and latency. To cope with the booming use of neural networks on edge devices, a promising way is to perform in-memory computing through exploiting the next-generation memristive devices. This work proposes a 2-level resistive random-access memory (RRAM)-based memory-augmented neural network (MANN), named binarized-all-through MANN (BATMANN), that is end-to-end trainable and allows both the controller and memory to be seamlessly integrated onto RRAM crossbars. Experiments then show the superiority of BATMANN in doing few-shot learning with high accuracy and robustness. |
Description | Session B3 : Computing-in/near-Memory II - no. 0359 |
Persistent Identifier | http://hdl.handle.net/10722/308265 |
ISSN | 2162-7541 (2020 SCImago Journal Rankings: 0.125) |
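The abstract above describes a controller and a key-value memory that are both binary, so their dot products can be carried out directly on RRAM crossbars for few-shot lookup. As a rough illustration of that general idea only (not the authors' code: the random-projection controller and every name and size below are assumptions), the following sketch classifies a query by Hamming-distance lookup in a binary key-value memory.

```python
import numpy as np

# Minimal sketch of a binarized key-value memory for one-shot classification.
# This is NOT the BATMANN implementation: the random-projection "controller",
# all names, and all sizes here are illustrative assumptions. In the paper the
# controller is a trained binary neural network and both controller and memory
# are mapped onto RRAM crossbars; here everything runs in plain numpy.

rng = np.random.default_rng(0)

D_IN, D_KEY = 64, 256  # input feature width, binary key width (assumed sizes)

# Stand-in "controller": fixed random projection followed by sign binarization.
W = rng.standard_normal((D_IN, D_KEY))

def binarize(x):
    """Map a real-valued feature vector to a {+1, -1} binary code."""
    return np.where(x @ W >= 0, 1, -1).astype(np.int8)

class BinaryKeyValueMemory:
    """Binary keys plus class labels; read by Hamming-distance nearest key."""

    def __init__(self):
        self.keys = np.empty((0, D_KEY), dtype=np.int8)
        self.labels = []

    def write(self, x, label):
        self.keys = np.vstack([self.keys, binarize(x)])
        self.labels.append(label)

    def read(self, x):
        q = binarize(x).astype(np.int32)
        # For {+1, -1} codes, dot product = D_KEY - 2 * Hamming distance, so the
        # key with the largest dot product is the Hamming-nearest key. On an RRAM
        # crossbar this matrix-vector product is the in-memory operation.
        sims = self.keys.astype(np.int32) @ q
        return self.labels[int(np.argmax(sims))]

# One-shot episode on synthetic data: write one support example per class,
# then read back the label of a noisy query drawn from class 1.
mem = BinaryKeyValueMemory()
prototypes = rng.standard_normal((5, D_IN))
for c, p in enumerate(prototypes):
    mem.write(p, c)

query = prototypes[1] + 0.3 * rng.standard_normal(D_IN)
print("predicted class:", mem.read(query))  # 1 with high probability
```

The numpy matrix-vector product in `read` stands in for the analog dot product an RRAM crossbar computes in place; the end-to-end training of the binary controller that the paper emphasizes is omitted from this sketch.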
DC Field | Value | Language |
---|---|---|
dc.contributor.author | REN, Y | - |
dc.contributor.author | LIN, R | - |
dc.contributor.author | RAN, J | - |
dc.contributor.author | LIU, C | - |
dc.contributor.author | TAO, C | - |
dc.contributor.author | Wang, Z | - |
dc.contributor.author | Li, C | - |
dc.contributor.author | Wong, N | - |
dc.date.accessioned | 2021-11-12T13:44:49Z | - |
dc.date.available | 2021-11-12T13:44:49Z | - |
dc.date.issued | 2021 | - |
dc.identifier.citation | 2021 IEEE 14th International Conference on ASIC (ASICON 2021), Kunming, China, 26-29 October 2021, p. 1-4 | - |
dc.identifier.issn | 2162-7541 | - |
dc.identifier.uri | http://hdl.handle.net/10722/308265 | - |
dc.description | Session B3 : Computing-in/near-Memory II - no. 0359 | - |
dc.description.abstract | The traditional von Neumann architecture suffers from heavy data traffic between processing and memory units, which incurs high power and latency. To cope with the booming use of neural networks on edge devices, a promising way is to perform in-memory computing through exploiting the next-generation memristive devices. This work proposes a 2-level resistive random-access memory (RRAM)-based memory-augmented neural network (MANN), named binarized-all-through MANN (BATMANN), that is end-to-end trainable and allows both the controller and memory to be seamlessly integrated onto RRAM crossbars. Experiments then show the superiority of BATMANN in doing few-shot learning with high accuracy and robustness. | - |
dc.language | eng | - |
dc.publisher | IEEE. The Journal's web site is located at https://ieeexplore.ieee.org/xpl/conhome/1000054/all-proceedings | - |
dc.relation.ispartof | IEEE International Conference on ASIC Proceedings | - |
dc.rights | IEEE International Conference on ASIC Proceedings. Copyright © IEEE. | - |
dc.rights | ©2021 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | - |
dc.subject | RRAM | - |
dc.subject | memory augmented | - |
dc.subject | binary | - |
dc.subject | neural networks | - |
dc.subject | in-memory computing | - |
dc.title | BATMANN: A Binarized-All-Through Memory-Augmented Neural Network for Efficient In-Memory Computing | - |
dc.type | Conference_Paper | - |
dc.identifier.email | Wang, Z: zrwang@eee.hku.hk | - |
dc.identifier.email | Li, C: canl@hku.hk | - |
dc.identifier.email | Wong, N: nwong@eee.hku.hk | - |
dc.identifier.authority | Wang, Z=rp02714 | - |
dc.identifier.authority | Li, C=rp02706 | - |
dc.identifier.authority | Wong, N=rp00190 | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1109/ASICON52560.2021.9620292 | - |
dc.identifier.scopus | eid_2-s2.0-85122852581 | - |
dc.identifier.hkuros | 329309 | - |
dc.identifier.spage | 1 | - |
dc.identifier.epage | 4 | - |
dc.publisher.place | United States | - |