Conference Paper: Masked Spiking Transformer
Field | Value |
---|---|
Title | Masked Spiking Transformer |
Authors | Wang, Ziqing; Fang, Yuetong; Cao, Jiahang; Zhang, Qiang; Wang, Zhongrui; Xu, Renjing |
Issue Date | 6-Oct-2023 |
Abstract | The combination of Spiking Neural Networks (SNNs) and Transformers has attracted significant attention due to their potential for high energy efficiency and high performance. However, existing works on this topic typically rely on direct training, which can lead to suboptimal performance. To address this issue, we propose to leverage the benefits of the ANN-to-SNN conversion method to combine SNNs and Transformers, resulting in significantly improved performance over existing state-of-the-art SNN models. Furthermore, inspired by the quantal synaptic failures observed in the nervous system, which reduce the number of spikes transmitted across synapses, we introduce a novel Masked Spiking Transformer (MST) framework that incorporates a Random Spike Masking (RSM) method to prune redundant spikes and reduce energy consumption without sacrificing performance. Our experimental results demonstrate that the proposed MST model achieves a significant reduction of 26.8% in power consumption at a masking ratio of 75% while maintaining the same level of performance as the unmasked model. The code is available at: https://github.com/bic-L/Masked-Spiking-Transformer. |
Persistent Identifier | http://hdl.handle.net/10722/339391 |
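The Random Spike Masking idea described in the abstract above amounts to randomly dropping a fixed fraction of binary spike events so that fewer synaptic operations (and hence less energy) are spent. The following is a minimal illustrative sketch of that idea, not the authors' implementation: the function name `random_spike_masking` and the plain Bernoulli drop are assumptions; see the linked repository for the actual MST code.

```python
import torch

def random_spike_masking(spikes: torch.Tensor, mask_ratio: float = 0.75) -> torch.Tensor:
    """Randomly zero out a fraction of binary spike events.

    spikes: tensor of spike events (0/1 values), any shape.
    mask_ratio: fraction of spikes to drop; 0.75 matches the
        paper's headline 75% masking-ratio result.
    """
    if mask_ratio <= 0.0:
        return spikes
    # Each spike survives independently with probability (1 - mask_ratio).
    keep = torch.rand_like(spikes) >= mask_ratio
    return spikes * keep.to(spikes.dtype)

# Example: mask a hypothetical batch of spike activations
# (shape and firing rate here are arbitrary, for illustration only).
spikes = (torch.rand(4, 196, 384) < 0.1).float()
masked = random_spike_masking(spikes, mask_ratio=0.75)
print(spikes.sum().item(), masked.sum().item())  # roughly 4x fewer spikes remain
```

Because each spike is kept or dropped independently, the expected number of transmitted spikes scales linearly with `1 - mask_ratio`, which is why spike count (and the associated accumulate-operation energy) drops roughly in proportion to the masking ratio.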
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Wang, Ziqing | - |
dc.contributor.author | Fang, Yuetong | - |
dc.contributor.author | Cao, Jiahang | - |
dc.contributor.author | Zhang, Qiang | - |
dc.contributor.author | Wang, Zhongrui | - |
dc.contributor.author | Xu, Renjing | - |
dc.date.accessioned | 2024-03-11T10:36:15Z | - |
dc.date.available | 2024-03-11T10:36:15Z | - |
dc.date.issued | 2023-10-06 | - |
dc.identifier.uri | http://hdl.handle.net/10722/339391 | - |
dc.description.abstract | The combination of Spiking Neural Networks (SNNs) and Transformers has attracted significant attention due to their potential for high energy efficiency and high performance. However, existing works on this topic typically rely on direct training, which can lead to suboptimal performance. To address this issue, we propose to leverage the benefits of the ANN-to-SNN conversion method to combine SNNs and Transformers, resulting in significantly improved performance over existing state-of-the-art SNN models. Furthermore, inspired by the quantal synaptic failures observed in the nervous system, which reduce the number of spikes transmitted across synapses, we introduce a novel Masked Spiking Transformer (MST) framework that incorporates a Random Spike Masking (RSM) method to prune redundant spikes and reduce energy consumption without sacrificing performance. Our experimental results demonstrate that the proposed MST model achieves a significant reduction of 26.8% in power consumption at a masking ratio of 75% while maintaining the same level of performance as the unmasked model. The code is available at: https://github.com/bic-L/Masked-Spiking-Transformer. | -
dc.language | eng | - |
dc.relation.ispartof | 2023 International Conference on Computer Vision (ICCV) (02/10/2023-06/10/2023, Paris, France) | -
dc.title | Masked Spiking Transformer | - |
dc.type | Conference_Paper | - |