Article: All You Need Is Feedback: Communication with Block Attention Feedback Codes

Title: All You Need Is Feedback: Communication with Block Attention Feedback Codes
Authors: Ozfatura, Emre; Shao, Yulin; Perotti, Alberto G.; Popovic, Branislav M.; Gunduz, Deniz
Keywords: attention mechanism; channel coding; deep learning; deep neural networks; Feedback code; self-attention; transformer; ultra-reliable short-packet communications
Issue Date: 2022
Citation: IEEE Journal on Selected Areas in Information Theory, 2022, v. 3, n. 3, p. 587-602
Abstract: Deep neural network (DNN)-based channel code designs have recently gained interest as an alternative to conventional coding schemes, particularly for channels in which existing codes do not provide satisfactory performance. Coding in the presence of feedback is one such problem, for which promising results have recently been obtained by various DNN-based coding architectures. In this paper, we introduce a novel learning-aided feedback code design, dubbed generalized block attention feedback (GBAF) codes, that achieves orders-of-magnitude improvements in block error rate (BLER) compared to existing solutions. Sequence-to-sequence encoding and block-by-block processing of the message bits are the two important design principles of the GBAF codes, which not only reduce the communication overhead due to fewer interactions between the transmitter and receiver, but also enable flexible coding rates. GBAF codes also have a modular structure that can be implemented using different neural network architectures. In this work, we employ the popular transformer architecture, which outperforms all the prior DNN-based code designs in terms of BLER in the low signal-to-noise ratio regime when the feedback channel is noiseless.
Persistent Identifier: http://hdl.handle.net/10722/363761
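
The abstract describes a transformer that processes the message bits block-by-block and emits channel symbols for all blocks jointly at each interaction round. As a rough illustration of that idea only (not the authors' released implementation; the block size, per-round feature layout, and power normalization below are assumptions), a minimal PyTorch sketch of a block-wise transformer feedback encoder might look like this:

```python
# Hypothetical sketch of a block-wise transformer feedback encoder.
# Not the GBAF reference code: block size, feedback layout, and
# normalization are illustrative assumptions.
import torch
import torch.nn as nn

class BlockFeedbackEncoder(nn.Module):
    """Maps (message blocks, feedback from past rounds) to one channel
    symbol per block at the current interaction round."""
    def __init__(self, block_size=4, max_rounds=9, d_model=32, nhead=4, num_layers=2):
        super().__init__()
        # Per-block features: the raw bits plus the feedback received so far.
        in_dim = block_size + max_rounds
        self.embed = nn.Linear(in_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead,
                                           dim_feedforward=4 * d_model,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, 1)   # one real symbol per block per round

    def forward(self, bits, feedback):
        # bits:     (batch, num_blocks, block_size), values in {0, 1}
        # feedback: (batch, num_blocks, max_rounds), zero-padded for future rounds
        x = torch.cat([2.0 * bits - 1.0, feedback], dim=-1)  # BPSK-style bit mapping
        h = self.encoder(self.embed(x))       # self-attention across the blocks
        sym = self.head(h).squeeze(-1)        # (batch, num_blocks)
        # Normalize to meet an average power constraint of 1 per symbol.
        return sym / (sym.pow(2).mean().sqrt() + 1e-8)

# Toy usage: 8 blocks of 4 bits, first round, nothing received on feedback yet.
enc = BlockFeedbackEncoder()
bits = torch.randint(0, 2, (1, 8, 4)).float()
feedback = torch.zeros(1, 8, 9)
symbols = enc(bits, feedback)                 # (1, 8) channel symbols for this round
print(symbols.shape)
```

In this sketch, self-attention across the blocks plays the role of the sequence-to-sequence encoding mentioned in the abstract, and the feedback tensor would be filled in round by round with whatever the transmitter observes on the feedback link.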

 

DC Field: Value
dc.contributor.author: Ozfatura, Emre
dc.contributor.author: Shao, Yulin
dc.contributor.author: Perotti, Alberto G.
dc.contributor.author: Popovic, Branislav M.
dc.contributor.author: Gunduz, Deniz
dc.date.accessioned: 2025-10-10T07:49:10Z
dc.date.available: 2025-10-10T07:49:10Z
dc.date.issued: 2022
dc.identifier.citation: IEEE Journal on Selected Areas in Information Theory, 2022, v. 3, n. 3, p. 587-602
dc.identifier.uri: http://hdl.handle.net/10722/363761
dc.description.abstract: Deep neural network (DNN)-based channel code designs have recently gained interest as an alternative to conventional coding schemes, particularly for channels in which existing codes do not provide satisfactory performance. Coding in the presence of feedback is one such problem, for which promising results have recently been obtained by various DNN-based coding architectures. In this paper, we introduce a novel learning-aided feedback code design, dubbed generalized block attention feedback (GBAF) codes, that achieves orders-of-magnitude improvements in block error rate (BLER) compared to existing solutions. Sequence-to-sequence encoding and block-by-block processing of the message bits are the two important design principles of the GBAF codes, which not only reduce the communication overhead due to fewer interactions between the transmitter and receiver, but also enable flexible coding rates. GBAF codes also have a modular structure that can be implemented using different neural network architectures. In this work, we employ the popular transformer architecture, which outperforms all the prior DNN-based code designs in terms of BLER in the low signal-to-noise ratio regime when the feedback channel is noiseless.
dc.language: eng
dc.relation.ispartof: IEEE Journal on Selected Areas in Information Theory
dc.subject: attention mechanism
dc.subject: channel coding
dc.subject: deep learning
dc.subject: deep neural networks
dc.subject: Feedback code
dc.subject: self-attention
dc.subject: transformer
dc.subject: ultra-reliable short-packet communications
dc.title: All You Need Is Feedback: Communication with Block Attention Feedback Codes
dc.type: Article
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1109/JSAIT.2022.3223901
dc.identifier.scopus: eid_2-s2.0-85148992464
dc.identifier.volume: 3
dc.identifier.issue: 3
dc.identifier.spage: 587
dc.identifier.epage: 602
dc.identifier.eissn: 2641-8770
