Conference Paper: Input-Constrained Erasure Channels: Mutual Information and Capacity

Title: Input-Constrained Erasure Channels: Mutual Information and Capacity
Authors: Li, Y; Han, G
Issue Date: 2014
Publisher: IEEE. The Journal's web site is located at http://ieeexplore.ieee.org/xpl/conhome.jsp?punumber=1000369
Citation: IEEE International Symposium on Information Theory (ISIT), Honolulu, USA, 29 June - 4 July 2014. In IEEE International Symposium on Information Theory Proceedings, 2014, p. 3072-3076
Abstract: In this paper, we derive an explicit formula for the entropy rate of a hidden Markov chain, observed when the Markov chain passes through a memoryless erasure channel. This result naturally leads to an explicit formula for the mutual information rate of memoryless erasure channels with Markovian inputs. Moreover, if the input Markov chain is first-order and supported on the (1,∞)-run length limited (RLL) constraint, we show that the mutual information rate is strictly concave with respect to a chosen parameter. We then apply a recent algorithm [1] to approximately compute the first-order noisy constrained channel capacity and the corresponding capacity-achieving distribution.
Persistent Identifier: http://hdl.handle.net/10722/204178
ISBN: 9781479951864
ISSN: 0271-4655
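The setting described in the abstract can be illustrated with a short, hypothetical Python sketch. This is not the paper's explicit formula or the algorithm [1]; it simply Monte-Carlo-estimates the output entropy rate H(Y) of a memoryless erasure channel driven by a first-order (1,∞)-RLL Markov input using the forward recursion, then recovers the mutual information rate as H(Y) − h(ε), since for an erasure channel H(Y|X) equals the binary entropy h(ε) of the erasure probability. All names and parameter values are illustrative.

```python
import math
import random

def rll_transition(p):
    # First-order Markov chain on {0,1} satisfying the (1,inf)-RLL
    # constraint (no two consecutive 1s): after a 1, the next symbol is 0.
    return [[1.0 - p, p], [1.0, 0.0]]

def simulate_output(P, eps, n, rng):
    # Run the chain from its stationary distribution and erase each
    # symbol independently with probability eps (None marks an erasure).
    p = P[0][1]
    pi = [1.0 / (1.0 + p), p / (1.0 + p)]
    x = 0 if rng.random() < pi[0] else 1
    ys = []
    for _ in range(n):
        ys.append(None if rng.random() < eps else x)
        x = 0 if rng.random() < P[x][0] else 1
    return ys

def output_entropy_rate(P, eps, ys):
    # Estimate H(Y) in bits/symbol as -(1/n) log2 P(y_1..y_n),
    # computing P(y_1..y_n) exactly with the forward recursion.
    p = P[0][1]
    b = [1.0 / (1.0 + p), p / (1.0 + p)]  # belief over X_t given y_1..y_{t-1}
    logp = 0.0
    for y in ys:
        if y is None:                       # erasure: independent of X_t
            logp += math.log2(eps)
            post = b
        else:                               # symbol observed exactly
            logp += math.log2((1.0 - eps) * b[y])
            post = [0.0, 0.0]
            post[y] = 1.0
        b = [post[0] * P[0][0] + post[1] * P[1][0],
             post[0] * P[0][1] + post[1] * P[1][1]]
    return -logp / len(ys)

def h2(t):
    # Binary entropy in bits.
    return 0.0 if t in (0.0, 1.0) else -t * math.log2(t) - (1 - t) * math.log2(1 - t)

rng = random.Random(0)
p, eps, n = 0.5, 0.3, 200_000
P = rll_transition(p)
ys = simulate_output(P, eps, n, rng)
hy = output_entropy_rate(P, eps, ys)
mi = hy - h2(eps)                   # I(X;Y) rate = H(Y) - H(Y|X)
hx = (1.0 / (1.0 + p)) * h2(p)      # input entropy rate of the RLL chain
print(f"H(X)={hx:.4f}  H(Y)={hy:.4f}  I={mi:.4f} bits/symbol")
```

Maximizing this mutual information rate over the transition parameter p would give a Monte Carlo approximation of the first-order noisy constrained capacity that the paper computes with the explicit formula and the algorithm [1].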


DC Field | Value | Language
dc.contributor.author | Li, Y | en_US
dc.contributor.author | Han, G | en_US
dc.date.accessioned | 2014-09-19T21:19:52Z | -
dc.date.available | 2014-09-19T21:19:52Z | -
dc.date.issued | 2014 | en_US
dc.identifier.citation | IEEE International Symposium on Information Theory (ISIT), Honolulu, USA, 29 June - 4 July 2014. In IEEE International Symposium on Information Theory Proceedings, 2014, p. 3072-3076 | en_US
dc.identifier.isbn | 9781479951864 | -
dc.identifier.issn | 0271-4655 | -
dc.identifier.uri | http://hdl.handle.net/10722/204178 | -
dc.description.abstract | In this paper, we derive an explicit formula for the entropy rate of a hidden Markov chain, observed when the Markov chain passes through a memoryless erasure channel. This result naturally leads to an explicit formula for the mutual information rate of memoryless erasure channels with Markovian inputs. Moreover, if the input Markov chain is first-order and supported on the (1,∞)-run length limited (RLL) constraint, we show that the mutual information rate is strictly concave with respect to a chosen parameter. We then apply a recent algorithm [1] to approximately compute the first-order noisy constrained channel capacity and the corresponding capacity-achieving distribution. | en_US
dc.language | eng | en_US
dc.publisher | IEEE. The Journal's web site is located at http://ieeexplore.ieee.org/xpl/conhome.jsp?punumber=1000369 | en_US
dc.relation.ispartof | IEEE International Symposium on Information Theory Proceedings | en_US
dc.title | Input-Constrained Erasure Channels: Mutual Information and Capacity | en_US
dc.type | Conference_Paper | en_US
dc.identifier.email | Han, G: ghan@hku.hk | en_US
dc.identifier.authority | Han, G=rp00702 | en_US
dc.description.nature | link_to_subscribed_fulltext | -
dc.identifier.doi | 10.1109/ISIT.2014.6875399 | en_US
dc.identifier.scopus | eid_2-s2.0-84906535619 | -
dc.identifier.hkuros | 237850 | en_US
dc.identifier.spage | 3072 | en_US
dc.identifier.epage | 3076 | en_US
dc.publisher.place | United States | -
dc.identifier.issnl | 0271-4655 | -
