Article: All You Need Is Feedback: Communication with Block Attention Feedback Codes
| Title | All You Need Is Feedback: Communication with Block Attention Feedback Codes |
|---|---|
| Authors | Ozfatura, Emre; Shao, Yulin; Perotti, Alberto G.; Popovic, Branislav M.; Gunduz, Deniz |
| Keywords | attention mechanism; channel coding; deep learning; deep neural networks; Feedback code; self-attention; transformer; ultra-reliable short-packet communications |
| Issue Date | 2022 |
| Citation | IEEE Journal on Selected Areas in Information Theory, 2022, v. 3, n. 3, p. 587-602 |
| Abstract | Deep neural network (DNN)-based channel code designs have recently gained interest as an alternative to conventional coding schemes, particularly for channels in which existing codes do not provide satisfactory performance. Coding in the presence of feedback is one such problem, for which promising results have recently been obtained by various DNN-based coding architectures. In this paper, we introduce a novel learning-aided feedback code design, dubbed generalized block attention feedback (GBAF) codes, that achieves orders-of-magnitude improvements in block error rate (BLER) compared to existing solutions. Sequence-to-sequence encoding and block-by-block processing of the message bits are the two important design principles of the GBAF codes, which not only reduce the communication overhead, due to fewer interactions between the transmitter and receiver, but also enable flexible coding rates. GBAF codes also have a modular structure that can be implemented using different neural network architectures. In this work, we employ the popular transformer architecture, which outperforms all the prior DNN-based code designs in terms of BLER in the low signal-to-noise ratio regime when the feedback channel is noiseless. |
| Persistent Identifier | http://hdl.handle.net/10722/363761 |
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Ozfatura, Emre | - |
| dc.contributor.author | Shao, Yulin | - |
| dc.contributor.author | Perotti, Alberto G. | - |
| dc.contributor.author | Popovic, Branislav M. | - |
| dc.contributor.author | Gunduz, Deniz | - |
| dc.date.accessioned | 2025-10-10T07:49:10Z | - |
| dc.date.available | 2025-10-10T07:49:10Z | - |
| dc.date.issued | 2022 | - |
| dc.identifier.citation | IEEE Journal on Selected Areas in Information Theory, 2022, v. 3, n. 3, p. 587-602 | - |
| dc.identifier.uri | http://hdl.handle.net/10722/363761 | - |
| dc.description.abstract | Deep neural network (DNN)-based channel code designs have recently gained interest as an alternative to conventional coding schemes, particularly for channels in which existing codes do not provide satisfactory performance. Coding in the presence of feedback is one such problem, for which promising results have recently been obtained by various DNN-based coding architectures. In this paper, we introduce a novel learning-aided feedback code design, dubbed generalized block attention feedback (GBAF) codes, that achieves orders-of-magnitude improvements in block error rate (BLER) compared to existing solutions. Sequence-to-sequence encoding and block-by-block processing of the message bits are the two important design principles of the GBAF codes, which not only reduce the communication overhead, due to fewer interactions between the transmitter and receiver, but also enable flexible coding rates. GBAF codes also have a modular structure that can be implemented using different neural network architectures. In this work, we employ the popular transformer architecture, which outperforms all the prior DNN-based code designs in terms of BLER in the low signal-to-noise ratio regime when the feedback channel is noiseless. | - |
| dc.language | eng | - |
| dc.relation.ispartof | IEEE Journal on Selected Areas in Information Theory | - |
| dc.subject | attention mechanism | - |
| dc.subject | channel coding | - |
| dc.subject | deep learning | - |
| dc.subject | deep neural networks | - |
| dc.subject | Feedback code | - |
| dc.subject | self-attention | - |
| dc.subject | transformer | - |
| dc.subject | ultra-reliable short-packet communications | - |
| dc.title | All You Need Is Feedback: Communication with Block Attention Feedback Codes | - |
| dc.type | Article | - |
| dc.description.nature | link_to_subscribed_fulltext | - |
| dc.identifier.doi | 10.1109/JSAIT.2022.3223901 | - |
| dc.identifier.scopus | eid_2-s2.0-85148992464 | - |
| dc.identifier.volume | 3 | - |
| dc.identifier.issue | 3 | - |
| dc.identifier.spage | 587 | - |
| dc.identifier.epage | 602 | - |
| dc.identifier.eissn | 2641-8770 | - |
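The abstract describes the GBAF design only at a high level: message bits are processed block by block, and a transformer uses self-attention across blocks together with feedback from the receiver to generate parity symbols. The snippet below is a minimal, illustrative sketch of that idea, not the paper's implementation; the class name `BlockFeedbackEncoder`, the one-symbol-per-block output, the way feedback is concatenated to the bit blocks, and all layer sizes are assumptions made for illustration.

```python
import torch
import torch.nn as nn

class BlockFeedbackEncoder(nn.Module):
    """Illustrative block-wise, transformer-based feedback encoder (assumed design)."""

    def __init__(self, block_size: int = 4, d_model: int = 32, nhead: int = 4, num_layers: int = 2):
        super().__init__()
        # Each "token" is one block of message bits plus the feedback observed
        # so far for that block (here: a single past noisy symbol, an assumption).
        self.embed = nn.Linear(block_size + 1, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, dim_feedforward=64, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.to_symbol = nn.Linear(d_model, 1)  # one channel symbol per block per round

    def forward(self, bit_blocks: torch.Tensor, feedback: torch.Tensor) -> torch.Tensor:
        # bit_blocks: (batch, num_blocks, block_size) in {0, 1}
        # feedback:   (batch, num_blocks, 1) noisy observations fed back so far
        x = torch.cat([bit_blocks.float(), feedback], dim=-1)
        h = self.encoder(self.embed(x))  # self-attention across the blocks
        return self.to_symbol(h)         # (batch, num_blocks, 1)

# Usage sketch: 2 codewords, 6 blocks of 4 bits each, first round (no feedback yet).
enc = BlockFeedbackEncoder()
bits = torch.randint(0, 2, (2, 6, 4))
fb = torch.zeros(2, 6, 1)
symbols = enc(bits, fb)
print(symbols.shape)  # torch.Size([2, 6, 1])
```

In an actual feedback-coding loop, the transmitter would run such an encoder once per interaction round, feeding back the receiver's noisy observations from earlier rounds; the paper's exact architecture, rate control, and training procedure are given in the article cited above.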
