File Download
There are no files associated with this item.
Links for fulltext
(May Require Subscription)
- Publisher Website: 10.23919/DATE56975.2023.10137218
- Scopus: eid_2-s2.0-85162732154
- WOS: WOS:001027444200241
Conference Paper: PECAN: A Product-Quantized Content Addressable Memory Network
Title | PECAN: A Product-Quantized Content Addressable Memory Network |
---|---|
Authors | Ran, Jie; Lin, Rui; Li, Lok Chun Jason; Zhou, Jiajun; Wong, Ngai |
Keywords | DNN compression; in-memory computing; product quantization |
Issue Date | 17-Apr-2023 |
Publisher | IEEE |
Abstract | A novel deep neural network (DNN) architecture is proposed wherein the filtering and linear transform are realized solely with product quantization (PQ). This results in a natural implementation via content addressable memory (CAM), which transcends regular DNN layer operations and requires only simple table lookup. Two schemes are developed for the end-to-end PQ prototype training, namely, through angle- and distance-based similarities, which differ in their multiplicative and additive natures with different complexity-accuracy tradeoffs. Even more, the distance-based scheme constitutes a truly multiplier-free DNN solution. Experiments confirm the feasibility of such Product-QuantizEd Content Addressable Memory Network (PECAN), which has strong implication on hardware-efficient deployments especially for in-memory computing. |
Persistent Identifier | http://hdl.handle.net/10722/339482 |
ISI Accession Number ID | WOS:001027444200241 |
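The core idea in the abstract, replacing a layer's multiply-accumulate with per-subspace prototype matching followed by table lookup, can be sketched in generic form. The snippet below is a minimal illustrative sketch of textbook product quantization with an additive (L1) encoder, not PECAN's actual training scheme or CAM hardware; all function names, shapes, and codebooks are hypothetical.

```python
# Illustrative sketch of product-quantized (PQ) inference by table
# lookup. The L1 encoder and all names/shapes are assumptions for
# illustration -- not the paper's implementation.

def l1_distance(a, b):
    # Additive (multiplier-free) similarity used to select a prototype.
    return sum(abs(u - v) for u, v in zip(a, b))

def pq_encode(x, codebooks):
    # Split x into len(codebooks) equal subvectors; map each subvector
    # to the index of its nearest prototype in that subspace's codebook.
    d = len(x) // len(codebooks)
    codes = []
    for i, cb in enumerate(codebooks):
        sub = x[i * d:(i + 1) * d]
        codes.append(min(range(len(cb)),
                         key=lambda k: l1_distance(sub, cb[k])))
    return codes

def build_tables(w, codebooks):
    # Precompute partial dot products tables[i][k] = w_i . c_{i,k},
    # so inference needs only lookups and additions (the CAM analogy:
    # match a stored pattern, then read out an associated value).
    d = len(w) // len(codebooks)
    return [[sum(wi * ci for wi, ci in zip(w[i * d:(i + 1) * d], proto))
             for proto in cb]
            for i, cb in enumerate(codebooks)]

def pq_dot(codes, tables):
    # The layer's dot product collapses to one lookup per subspace.
    return sum(tables[i][k] for i, k in enumerate(codes))
```

For example, with two subspaces of length 2 and two prototypes each, `pq_dot(pq_encode(x, cbs), build_tables(w, cbs))` approximates `w . x`, and recovers it exactly whenever each subvector of `x` coincides with a stored prototype.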
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Ran, Jie | - |
dc.contributor.author | Lin, Rui | - |
dc.contributor.author | Li, Lok Chun Jason | - |
dc.contributor.author | Zhou, Jiajun | - |
dc.contributor.author | Wong, Ngai | - |
dc.date.accessioned | 2024-03-11T10:37:00Z | - |
dc.date.available | 2024-03-11T10:37:00Z | - |
dc.date.issued | 2023-04-17 | - |
dc.identifier.uri | http://hdl.handle.net/10722/339482 | - |
dc.description.abstract | A novel deep neural network (DNN) architecture is proposed wherein the filtering and linear transform are realized solely with product quantization (PQ). This results in a natural implementation via content addressable memory (CAM), which transcends regular DNN layer operations and requires only simple table lookup. Two schemes are developed for the end-to-end PQ prototype training, namely, through angle- and distance-based similarities, which differ in their multiplicative and additive natures with different complexity-accuracy tradeoffs. Even more, the distance-based scheme constitutes a truly multiplier-free DNN solution. Experiments confirm the feasibility of such Product-QuantizEd Content Addressable Memory Network (PECAN), which has strong implication on hardware-efficient deployments especially for in-memory computing. | - |
dc.language | eng | - |
dc.publisher | IEEE | - |
dc.relation.ispartof | 2023 Design, Automation & Test in Europe Conference & Exhibition (DATE) (17/04/2023-19/04/2023, Antwerp) | - |
dc.subject | DNN compression | - |
dc.subject | in-memory computing | - |
dc.subject | product quantization | - |
dc.title | PECAN: A Product-Quantized Content Addressable Memory Network | - |
dc.type | Conference_Paper | - |
dc.identifier.doi | 10.23919/DATE56975.2023.10137218 | - |
dc.identifier.scopus | eid_2-s2.0-85162732154 | - |
dc.identifier.volume | 2023-April | - |
dc.identifier.isi | WOS:001027444200241 | - |