Conference Paper: Deformable Butterfly: A Highly Structured and Sparse Linear Transform

Title: Deformable Butterfly: A Highly Structured and Sparse Linear Transform
Authors: LIN, R; RAN, J; Chiu, KH; Chesi, G; Wong, N
Keywords: Deformable Butterfly; Linear transform; Model compression
Issue Date: 2021
Publisher: Neural Information Processing Systems Foundation, Inc. The Journal's web site is located at https://papers.nips.cc/
Citation: 35th Conference on Neural Information Processing Systems (NeurIPS), Virtual Conference, 7-10 December 2021. In Ranzato, M ... et al (eds.), Advances in Neural Information Processing Systems 34 (NIPS 2021) pre-proceedings
Abstract: We introduce a new kind of linear transform named Deformable Butterfly (DeBut) that generalizes the conventional butterfly matrices and can be adapted to various input-output dimensions. It inherits the fine-to-coarse-grained learnable hierarchy of traditional butterflies and, when deployed to neural networks, the prominent structures and sparsity in a DeBut layer constitute a new way for network compression. We apply DeBut as a drop-in replacement of standard fully connected and convolutional layers, and demonstrate its superiority in homogenizing a neural network and rendering it favorable properties such as light weight and low inference complexity, without compromising accuracy. The natural complexity-accuracy tradeoff arising from the myriad deformations of a DeBut layer also opens up new room for analytical and practical research. The codes and Appendix are publicly available at: https://github.com/ruilin0212/DeBut.
Description: Poster Presentation at Spot C2 in Virtual World
Persistent Identifier: http://hdl.handle.net/10722/307964
ISSN: 1049-5258
2020 SCImago Journal Rankings: 1.399
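To make the abstract's idea concrete, below is a minimal, illustrative NumPy sketch of the general principle it describes: one dense linear layer is replaced by a chain of structured sparse factors whose block shapes can differ per factor, so the chain can bridge mismatched input and output dimensions while storing far fewer nonzero parameters. The block layout, shapes, and helper function here are assumptions chosen for illustration only; this is not the authors' DeBut parameterization (the actual implementation is at https://github.com/ruilin0212/DeBut).

import numpy as np


def block_diag_factor(num_blocks, rows_per_block, cols_per_block, rng):
    """Return a (num_blocks*rows_per_block) x (num_blocks*cols_per_block)
    block-diagonal matrix whose diagonal blocks are dense and random.
    Illustrative only; real butterfly/DeBut factors interleave block
    connections so that the product can mix all inputs."""
    factor = np.zeros((num_blocks * rows_per_block, num_blocks * cols_per_block))
    for b in range(num_blocks):
        r, c = b * rows_per_block, b * cols_per_block
        factor[r:r + rows_per_block, c:c + cols_per_block] = rng.standard_normal(
            (rows_per_block, cols_per_block))
    return factor


rng = np.random.default_rng(0)

# Two structured sparse factors chaining 256 -> 128 -> 64 features.
f1 = block_diag_factor(num_blocks=32, rows_per_block=4, cols_per_block=8, rng=rng)  # 128 x 256
f2 = block_diag_factor(num_blocks=16, rows_per_block=4, cols_per_block=8, rng=rng)  #  64 x 128

# A standard fully connected layer mapping 256 -> 64 would store the dense
# 64 x 256 weight matrix directly; the factored form stores only the blocks.
dense_equivalent = f2 @ f1

x = rng.standard_normal(256)
assert np.allclose(f2 @ (f1 @ x), dense_equivalent @ x)  # same linear map

sparse_params = np.count_nonzero(f1) + np.count_nonzero(f2)  # 32*4*8 + 16*4*8 = 1536
dense_params = dense_equivalent.size                          # 64*256 = 16384
print(f"nonzeros in factors: {sparse_params} vs dense layer: {dense_params}")

Applying the factors sequentially costs roughly one multiply-add per stored nonzero per input, which is where the abstract's claim of light weight and low inference complexity comes from; the "deformable" aspect refers to letting the block shapes vary across factors so the chain fits arbitrary layer dimensions.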

 

DC Field: Value
dc.contributor.author: LIN, R
dc.contributor.author: RAN, J
dc.contributor.author: Chiu, KH
dc.contributor.author: Chesi, G
dc.contributor.author: Wong, N
dc.date.accessioned: 2021-11-12T13:40:28Z
dc.date.available: 2021-11-12T13:40:28Z
dc.date.issued: 2021
dc.identifier.citation: 35th Conference on Neural Information Processing Systems (NeurIPS), Virtual Conference, 7-10 December 2021. In Ranzato, M ... et al (eds.), Advances in Neural Information Processing Systems 34 (NIPS 2021) pre-proceedings
dc.identifier.issn: 1049-5258
dc.identifier.uri: http://hdl.handle.net/10722/307964
dc.description: Poster Presentation at Spot C2 in Virtual World
dc.description.abstract: We introduce a new kind of linear transform named Deformable Butterfly (DeBut) that generalizes the conventional butterfly matrices and can be adapted to various input-output dimensions. It inherits the fine-to-coarse-grained learnable hierarchy of traditional butterflies and, when deployed to neural networks, the prominent structures and sparsity in a DeBut layer constitute a new way for network compression. We apply DeBut as a drop-in replacement of standard fully connected and convolutional layers, and demonstrate its superiority in homogenizing a neural network and rendering it favorable properties such as light weight and low inference complexity, without compromising accuracy. The natural complexity-accuracy tradeoff arising from the myriad deformations of a DeBut layer also opens up new room for analytical and practical research. The codes and Appendix are publicly available at: https://github.com/ruilin0212/DeBut.
dc.language: eng
dc.publisher: Neural Information Processing Systems Foundation, Inc. The Journal's web site is located at https://papers.nips.cc/
dc.relation.ispartof: 35th Conference on Neural Information Processing Systems (NeurIPS), 2021
dc.relation.ispartof: Advances in Neural Information Processing Systems 34 (NIPS 2021 Proceedings)
dc.subject: Deformable Butterfly
dc.subject: Linear transform
dc.subject: Model compression
dc.title: Deformable Butterfly: A Highly Structured and Sparse Linear Transform
dc.type: Conference_Paper
dc.identifier.email: Chesi, G: chesi@eee.hku.hk
dc.identifier.email: Wong, N: nwong@eee.hku.hk
dc.identifier.authority: Chesi, G=rp00100
dc.identifier.authority: Wong, N=rp00190
dc.description.nature: published_or_final_version
dc.identifier.hkuros: 329307
dc.publisher.place: United States
