HKU Scholars Hub (http://hub.hku.hk)
The DSpace digital repository system captures, stores, indexes, preserves, and distributes digital research material.
Feed generated: Tue, 17 Sep 2024 06:29:36 GMT

- Title: On Mergings in Acyclic Directed Graphs
http://hdl.handle.net/10722/277238
Authors: Han, G
Abstract: Consider an acyclic directed graph $G$ with sources $s_1, s_2, \ldots, s_n$ and sinks $r_1, r_2, \ldots, r_n$. For $i=1, 2, \ldots, n$, let $c_i$ denote the size of the minimum edge cut between $s_i$ and $r_i$, which, by Menger's theorem, implies that there exists a group of $c_i$ edge-disjoint paths from $s_i$ to $r_i$. Although they are edge-disjoint within the same group, the above-mentioned edge-disjoint paths from different groups may merge with each other (or, roughly speaking, share a common subpath). In this paper we show that by choosing these paths appropriately, the number of mergings among all these edge-disjoint paths is always bounded by a finite function $\mathcal{M}(c_1, c_2, \ldots, c_n)$, which is independent of the size of $G$. Moreover, we prove some elementary properties of $\mathcal{M}(c_1, c_2, \ldots, c_n)$, derive exact values of $\mathcal{M}(1, c)$ and $\mathcal{M}(2, c)$, and establish a scaling law of $\mathcal{M}(c_1, c_2)$ when one of the parameters is fixed.
Date: 2019-01-01
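The min-cut sizes $c_i$ in the abstract above are exactly what a max-flow computation returns when every edge has unit capacity; by Menger's theorem this value also counts the edge-disjoint $s_i$-$r_i$ paths. As a hedged illustration (the toy graph and function names are mine, not from the paper), a minimal Edmonds-Karp sketch:

```python
from collections import defaultdict, deque

def max_flow(adj, s, t):
    """Edmonds-Karp max-flow on a unit-capacity digraph; the value equals
    the minimum s-t edge cut, i.e. the number of edge-disjoint s-t paths
    guaranteed by Menger's theorem."""
    cap = defaultdict(int)
    nbrs = defaultdict(set)
    for u, vs in adj.items():
        for v in vs:
            cap[(u, v)] += 1               # unit capacity per edge
            nbrs[u].add(v)
            nbrs[v].add(u)                 # residual arcs may go backwards
    flow = 0
    while True:
        parent, q = {s: None}, deque([s])
        while q and t not in parent:       # BFS for an augmenting path
            u = q.popleft()
            for v in nbrs[u]:
                if v not in parent and cap[(u, v)] > 0:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return flow
        v = t                              # augment by one unit along the path
        while parent[v] is not None:
            u = parent[v]
            cap[(u, v)] -= 1
            cap[(v, u)] += 1
            v = u
        flow += 1

# toy DAG with two edge-disjoint paths from 's' to 'r'
g = {'s': ['a', 'b'], 'a': ['r'], 'b': ['r']}
print(max_flow(g, 's', 'r'))  # 2
```

Running the paper's multi-source setting would simply repeat this per source-sink pair; counting the mergings among the chosen paths is the harder question the paper addresses.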
- Title: On continuous-time Gaussian channels
http://hdl.handle.net/10722/277239
Authors: Liu, X; Han, G
Abstract: A continuous-time white Gaussian channel can be formulated using white Gaussian noise, and a conventional way to examine such a channel is the sampling approach based on the Shannon-Nyquist sampling theorem, whereby the original continuous-time channel is converted to an equivalent discrete-time channel, to which a great variety of established tools and methodologies can be applied. However, one key issue of this scheme is that continuous-time feedback and memory cannot be incorporated into the channel model. It turns out that this issue can be circumvented by considering the Brownian motion formulation of a continuous-time white Gaussian channel. Nevertheless, as opposed to the white Gaussian noise formulation, a link that establishes the information-theoretic connection between a continuous-time channel under the Brownian motion formulation and its discrete-time counterparts has long been missing. This paper fills this gap by establishing causality-preserving connections between continuous-time Gaussian feedback/memory channels and their associated discrete-time versions in the forms of sampling and approximation theorems, which we believe will play important roles in further developing continuous-time information theory. As an immediate application of the approximation theorem, we propose the so-called approximation approach to examine continuous-time white Gaussian channels in the point-to-point or multi-user setting. It turns out that the approximation approach, complemented by relevant tools from stochastic calculus, can enhance our understanding of continuous-time Gaussian channels by giving alternative and strengthened interpretations of some long-held folklore, recovering “long-known” results from new perspectives, and rigorously establishing new results predicted by the intuition that the approximation approach carries.
More specifically, using the approximation approach complemented by relevant tools from stochastic calculus, we first derive the capacity regions of continuous-time white Gaussian multiple access channels and broadcast channels, and we then analyze how feedback affects their capacity regions: feedback will increase the capacity regions of some continuous-time white Gaussian broadcast channels and interference channels, while it will not increase the capacity regions of continuous-time white Gaussian multiple access channels.
Date: 2019-01-01

- Title: Feedback capacity of stationary Gaussian channels further examined
http://hdl.handle.net/10722/277449
Authors: Liu, T; Han, G
Abstract: It is well known that the problem of computing the feedback capacity of a stationary Gaussian channel can be recast as an infinite-dimensional optimization problem; moreover, necessary and sufficient conditions for the optimality of a solution to this optimization problem have been characterized, and based on this characterization, an explicit formula for the feedback capacity has been given for the case that the noise is a first-order autoregressive moving-average Gaussian process. In this paper, via a simple “change of variables” trick, we further examine the above-mentioned infinite-dimensional optimization problem. We prove that unless the Gaussian noise is white, its optimal solution is unique, and we propose an algorithm to recursively compute the unique optimal solution, which is guaranteed to converge in theory and features an efficient implementation for a suboptimal solution in practice. Furthermore, for the case that the noise is a $k$-th order autoregressive moving-average Gaussian process, we give a relatively more explicit formula for the feedback capacity; more specifically, the feedback capacity is expressed as a simple function evaluated at a solution to a system of polynomial equations.
Date: 2019-01-01

- Title: Concavity of mutual information rate for input-restricted finite-state memoryless channels at high SNR
http://hdl.handle.net/10722/62185
Authors: Han, G; Marcus, B
Abstract: We consider a finite-state memoryless channel with i.i.d. channel state and the input Markov process supported on a mixing finite-type constraint. We discuss the asymptotic behavior of entropy rate of the output hidden Markov chain and deduce that the mutual information rate of such a channel is concave with respect to the parameters of the input Markov processes at high signal-to-noise ratio. In principle, the concavity result enables good numerical approximation of the maximum mutual information rate and capacity of such a channel. © 2009 IEEE.
Description: 2009 IEEE International Symposium on Information Theory
Date: 2009-01-01
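The output entropy rate discussed in the abstract above has no closed form in general, but it can be estimated from a long sample path via the standard forward recursion. A hedged sketch (the binary chain, the symmetric flip noise, and all parameter values are illustrative choices of mine, not the channel model of the paper):

```python
import math
import random

def sample_entropy_rate(P, err, n, seed=0):
    """Estimate the entropy rate (bits/symbol) of a binary Markov chain
    observed through a symmetric noisy channel, via -log2 p(y_1..y_n)/n
    computed with the forward algorithm on one simulated sample path."""
    rng = random.Random(seed)
    # simulate the hidden chain x and its noisy observation y
    x, ys = 0, []
    for _ in range(n):
        x = 0 if rng.random() < P[x][0] else 1
        ys.append(x if rng.random() > err else 1 - x)
    # forward recursion with per-step renormalization
    alpha = [0.5, 0.5]                     # filtered state distribution
    loglik = 0.0
    for y in ys:
        new = []
        for j in (0, 1):
            emit = (1 - err) if y == j else err
            new.append(emit * sum(alpha[i] * P[i][j] for i in (0, 1)))
        s = new[0] + new[1]                # p(y_t | y_1..y_{t-1})
        loglik += math.log2(s)
        alpha = [v / s for v in new]
    return -loglik / n

P = [[0.7, 0.3], [0.4, 0.6]]               # illustrative transition matrix
print(round(sample_entropy_rate(P, err=0.1, n=200000), 3))
```

By the Shannon-McMillan-Breiman theorem the printed estimate converges to the true entropy rate as n grows, which is what makes sample-path estimates like this usable in practice.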
- Title: Menger's paths with minimum mergings
http://hdl.handle.net/10722/62186
Authors: Han, G
Abstract: For an acyclic directed graph with multiple sources and multiple sinks, we prove that one can choose the Menger's paths between the sources and the sinks such that the number of mergings between these paths is upper bounded by a constant depending only on the min-cuts between the sources and the sinks, regardless of the size and topology of the graph. We also give bounds on the minimum number of mergings between these paths, and discuss how it depends on the min-cuts. © 2009 IEEE.
Description: 2009 IEEE Information Theory Workshop on Networking and Information Theory
Date: 2009-01-01

- Title: A Strengthening of the Perron-Frobenius Theorem
http://hdl.handle.net/10722/299246
Authors: Han, G
Description: Organizer: Department of Computer Science, Sun Yat-sen University
Date: 2019-01-01

- Title: A Wong-Zakai approximation of stochastic differential equations driven by a general semimartingale
http://hdl.handle.net/10722/290587
Authors: Liu, X; Han, G
Abstract: We examine a Wong-Zakai type approximation of a family of stochastic differential equations driven by a general càdlàg semimartingale. For such an approximation, compared with the pointwise convergence result of Kurtz, Pardoux and Protter [10, Theorem 6.5], we establish stronger convergence results under the Skorokhod M1-topology, which, among other possible applications, imply the convergence of the first passage time of the solution to the approximating stochastic differential equation.
Date: 2021-01-01

- Title: On Sampling Continuous-Time Gaussian Channels
http://hdl.handle.net/10722/310799
Authors: Han, G
- Title: Information-Theoretic Extensions of the Shannon-Nyquist Sampling Theorem
http://hdl.handle.net/10722/310800
Authors: Han, G
Date: 2019-01-01

- Title: Extensions of the I-MMSE Relation
http://hdl.handle.net/10722/204177
Authors: Han, G; Song, J
Abstract: Unveiling a fundamental link between information theory and estimation theory, the I-MMSE relation of Guo, Shamai and Verdú [4] has great theoretical significance and numerous practical applications. On the other hand, its influence to date has been restricted to channels without feedback or memory, due to the lack of extensions of the I-MMSE relation to such channels. In this paper, we propose extensions of the I-MMSE relation for discrete- and continuous-time Gaussian channels with feedback or memory. Our approach is based on a very simple observation, which can also be applied to other scenarios, such as a simple and direct proof of the classical de Bruijn identity.
Date: 2014-01-01
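For the special case of a standard Gaussian input over a scalar Gaussian channel, both sides of the I-MMSE relation dI/dsnr = mmse/2 are available in closed form, so the identity can be sanity-checked numerically. A small sketch (the example and function names are mine; the paper's extensions to feedback/memory channels are not reproduced here):

```python
import math

def mutual_info(snr):
    # I(X; sqrt(snr)*X + N) for X, N ~ N(0, 1), in nats
    return 0.5 * math.log(1 + snr)

def mmse(snr):
    # minimum mean-square error of estimating X from sqrt(snr)*X + N
    return 1.0 / (1 + snr)

snr, h = 2.0, 1e-6
numeric = (mutual_info(snr + h) - mutual_info(snr - h)) / (2 * h)
print(abs(numeric - 0.5 * mmse(snr)) < 1e-9)  # True: dI/dsnr = mmse/2
```

The central difference on the left matches mmse(snr)/2 to numerical precision, which is the scalar instance of the relation the abstract extends.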
- Title: Input-Constrained Erasure Channels: Mutual Information and Capacity
http://hdl.handle.net/10722/204178
Authors: Li, Y; Han, G
Abstract: In this paper, we derive an explicit formula for the entropy rate of a hidden Markov chain, observed when a Markov chain passes through a memoryless erasure channel. This result naturally leads to an explicit formula for the mutual information rate of memoryless erasure channels with Markovian inputs. Moreover, if the input Markov chain is first-order and supported on the (1,∞)-run-length-limited (RLL) constraint, we show that the mutual information rate is strictly concave with respect to a chosen parameter. We then apply a recent algorithm [1] to approximately compute the first-order noisy constrained channel capacity and the corresponding capacity-achieving distribution.
Date: 2014-01-01
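The (1,∞)-RLL constraint mentioned above admits exactly the binary strings with no two adjacent 1s, whose count grows like a Fibonacci sequence; the noiseless capacity of the constraint is log2 of the golden ratio, about 0.6942. A small counting sketch (my own illustration, not the approximation algorithm [1] cited in the abstract):

```python
import math

def count_rll(n):
    """Number of binary strings of length n obeying the (1,infinity)-RLL
    constraint (no two consecutive 1s): a Fibonacci recursion."""
    if n < 1:
        return 1
    a, b = 1, 2          # counts for lengths 0 and 1
    for _ in range(n - 1):
        a, b = b, a + b
    return b

# noiseless capacity of the constraint = lim (1/n) log2 count_rll(n),
# which equals log2 of the golden ratio (1 + sqrt(5))/2 ~ 0.6942
est = math.log2(count_rll(60)) / 60
print(round(est, 3))
```

The finite-n estimate approaches log2((1 + √5)/2) from above as n grows; the noisy constrained capacity studied in the paper is strictly smaller.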
- Title: Recent Results in Continuous-Time Network Information Theory
http://hdl.handle.net/10722/204179
Authors: Liu, X; Han, G
Abstract: In this paper, we propose to use Brownian motions to formulate continuous-time multiuser Gaussian networks and derive the capacity regions of a continuous-time white Gaussian multiple access channel with/without feedback, a continuous-time white Gaussian interference channel without feedback, and a continuous-time white Gaussian broadcast channel without feedback. These “complete” results stand in stark contrast to the status quo of network information theory in discrete time, where the capacity regions of all the above-mentioned channels are known only for a handful of special scenarios. For certain cases, our results echo, from a different perspective, the folklore that “a continuous-time channel is the limit of bandwidth-limited discrete-time ones as the bandwidth tends to infinity”.
Date: 2014-01-01

- Title: On the Solvability of Three-Pair Networks With Common Bottleneck Links
http://hdl.handle.net/10722/204180
Authors: Cai, K; Han, G
Abstract: We consider the solvability problem under network coding and derive a necessary and sufficient condition for 3-pair networks with common “bottleneck links” to be solvable. We show that, for such networks: (1) solvability can be determined in polynomial time; (2) being solvable is equivalent to being linearly solvable; (3) finite fields of size 2 or 3 suffice to construct linear solutions.
Description: Session We2A: Network Coding II
Date: 2014-01-01

- Title: A Randomized Algorithm for the Capacity of Finite-state Channels
http://hdl.handle.net/10722/236605
Authors: Han, G
Abstract: Inspired by ideas from the field of stochastic approximation, we propose a randomized algorithm to compute the capacity of a finite-state channel with a Markovian input. When the mutual information rate of the channel is concave with respect to the chosen parameterization, the proposed algorithm proves to be convergent to the capacity of the channel almost surely with the derived convergence rate. We also discuss the convergence behavior of the algorithm without the concavity assumption.
Date: 2016-01-01
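The stochastic-approximation idea behind the algorithm above can be illustrated by a generic Robbins-Monro iteration with diminishing step sizes; the toy concave objective and noise model below are my own stand-ins, not the channel parameterization of the paper:

```python
import random

def robbins_monro(noisy_grad, x0, n_iter, seed=0):
    """Generic Robbins-Monro iteration x <- x + a_n * noisy_grad(x) with
    diminishing steps a_n = 1/n; for a concave objective the iterates
    converge almost surely to the maximizer, mirroring the kind of a.s.
    convergence guarantee cited in the abstract above."""
    rng = random.Random(seed)
    x = x0
    for n in range(1, n_iter + 1):
        x += noisy_grad(x, rng) / n        # noisy ascent step of size 1/n
    return x

# toy concave objective f(x) = -(x - 0.3)^2, gradient observed with noise
grad = lambda x, rng: -2 * (x - 0.3) + rng.gauss(0, 0.1)
print(round(robbins_monro(grad, x0=0.0, n_iter=200000), 2))  # 0.3
```

The 1/n steps are large enough to reach the optimum (their sum diverges) yet small enough to average out the noise (their squares sum), which is the standard trade-off behind such randomized capacity algorithms.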
- Title: On Rényi entropy rate of hidden Markov processes
http://hdl.handle.net/10722/236607
Authors: Han, G
Abstract: We prove that, under mild assumptions, the Rényi entropy rate of hidden Markov processes exists, with a tight convergence rate. Possible applications of this result and some related open problems will be discussed as well.
Date: 2016-01-01

- Title: Sampling theorems for continuous-time Gaussian channels
http://hdl.handle.net/10722/236608
Authors: Han, G
Abstract: We prove sampling theorems for continuous-time Gaussian channels with possible feedback. More specifically, we show that the mutual information of the discrete-time versions of a class of continuous-time Gaussian channels, obtained through sampling in time, converges to the mutual information of the original Gaussian channel as the sampling step-sizes tend to zero.
Date: 2015-01-01

- Title: Menger's Paths with Minimum Mergings
http://hdl.handle.net/10722/241365
Authors: Han, G
Date: 2011-01-01

- Title: Limit Theorems for the Sample Entropy of Hidden Markov Chains
http://hdl.handle.net/10722/241361
Authors: Han, G
Description: Session: Shannon Theory
Date: 2011-01-01

- Title: Menger’s Paths with Minimum Mergings: A Generalization of Menger’s Theorem
http://hdl.handle.net/10722/241363
Authors: Han, G
Abstract: For an acyclic directed graph with multiple sources and multiple sinks, we prove that one can choose the Menger’s paths between the sources and the sinks such that the number of mergings between these paths is upper bounded by a constant depending only on the min-cuts between the sources and the sinks, regardless of the size and topology of the graph. We also give bounds on the minimum number of mergings between these paths, and discuss how it depends on the min-cuts.
Description: Session: Combinatorial Topics in Coding and Information Theory
Date: 2010-01-01

- Title: On the capacity of multilevel NAND flash memory channels
http://hdl.handle.net/10722/233953
Authors: Li, Y; Kavcic, A; Han, G
Abstract: In this paper, we present a first information-theoretic study of multilevel NAND flash memory channels [2] with intercell interference. More specifically, for a multilevel NAND flash memory channel under mild assumptions, we first prove that such a channel is indecomposable and features the asymptotic equipartition property; we then further prove that stationary processes achieve its information capacity, and consequently, as its order tends to infinity, its Markov capacity converges to its information capacity; eventually, we establish that its operational capacity is equal to its information capacity. Our results suggest that it is highly plausible to apply the ideas and techniques used in the computation of the capacity of finite-state channels, which are relatively better explored, to the capacity of multilevel NAND flash memory channels. © 2016 IEEE.
Date: 2016-01-01

- Title: On Network Hubs
http://hdl.handle.net/10722/242726
Authors: Han, G
Abstract: Consider a network G = (V, E), where V denotes the set of vertices in G, and E denotes the set of edges in G. A vertex in G is said to be a source if it is only incident with outgoing edges, and a sink if it is only incident with incoming edges. Often, a source or sink is referred to as a terminal vertex. A non-terminal vertex is said to be a hub if its degree is greater than or equal to 3. In this paper, we are primarily concerned with the minimum number of hubs needed when certain constraints on the flow demand between multiple pairs of sources and sinks are imposed. The flow demand constraints considered in this paper will be in terms of the vertex-cuts between pairs of sources and sinks. This can be justified by a vertex version of the max-flow min-cut theorem [1], which states that for a network with infinite edge-capacity and unit vertex-capacity, the maximum flow between one source and one sink is equal to the minimum vertex-cut between them. Here, we remark that with appropriately modified setup, our results can be stated in terms of edge-cuts as well.
Date: 2014-01-01

- Title: On the Langberg–Médard multiple unicast conjecture
http://hdl.handle.net/10722/245055
Authors: Cai, K; Han, G
Abstract: A version of the multiple unicast conjecture, proposed by Langberg and Médard (in: Proceedings of the 47th Annual Allerton Conference, 2009), says that there exists an undirected fractional multi-commodity flow, or simply multi-flow, with rate (1,1,…,1) for strongly reachable networks. In this paper, we propose a nonsmooth matrix optimization problem to attack this conjecture: by giving upper and lower bounds on the objective value, we prove that there exists a multi-flow with rate at least (8/9,8/9,…,8/9) for such networks; on the other hand, we show that the rate of any multi-flow constructed using this framework cannot exceed (1,1,…,1).
Date: 2017-01-01

- Title: Rényi entropy rate of hidden Markov processes
http://hdl.handle.net/10722/247843
Authors: Wu, C; Xu, L; Han, G
Abstract: In this paper, we focus our attention on the Rényi entropy rate of hidden Markov processes under certain positivity assumptions. The existence of the Rényi entropy rate for such processes is established. Furthermore, we show that, with some extra “fast-forgetting” assumptions, the Rényi entropy rate of the approximating Markov processes exponentially converges to that of the original hidden Markov process, as the Markov order goes to infinity.
Date: 2017-01-01

- Title: The ARMA(k) Gaussian feedback capacity
http://hdl.handle.net/10722/247844
Authors: Liu, T; Han, G
Abstract: Using Kim's variational formulation [1] (with a slight yet important modification), we derive the ARMA(k) Gaussian feedback capacity, i.e., the feedback capacity of an additive channel where the noise is a k-th order autoregressive moving average Gaussian process. More specifically, the ARMA(k) Gaussian feedback capacity is expressed as a simple function evaluated at a solution to a system of polynomial equations, which proves to have only finitely many solutions for the cases k = 1, 2 and possibly beyond.
Date: 2017-01-01

- Title: Network encoding complexity: exact values, bounds and inequalities
http://hdl.handle.net/10722/247477
Authors: Xu, L; Shang, W; Han, G
Date: 2017-01-01

- Title: On the criteria for designing complex orthogonal space-time block codes
http://hdl.handle.net/10722/247475
Authors: Kan, H; Liu, X; Han, G
Date: 2016-01-01

- Title: Capacity of multilevel NAND flash memory channels
http://hdl.handle.net/10722/247476
Authors: Li, Y; Kavcic, A; Han, G
Date: 2017-01-01

- Title: The ARMA(k) Gaussian feedback capacity
http://hdl.handle.net/10722/252556
Authors: Han, G
Abstract: Using Kim's variational formulation (with a slight yet important modification), we derive the ARMA(k) Gaussian feedback capacity, i.e., the feedback capacity of an additive channel where the noise is a k-th order autoregressive moving average Gaussian process. More specifically, the ARMA(k) Gaussian feedback capacity is expressed as a simple function evaluated at a solution to a system of polynomial equations, which proves to have only finitely many solutions for some small k and possibly beyond.
Description: Feedback capacity
Date: 2017-01-01

- Title: On Sampling Theorems
http://hdl.handle.net/10722/252562
Authors: Han, G
Date: 2016-01-01

- Title: Recent Results on Input-Constrained Erasure Channels: A Case Study for Markov Approximation
http://hdl.handle.net/10722/252564
Authors: Li, Y; Han, G
Date: 2017-01-01

- Title: Limit theorems in hidden Markov models
http://hdl.handle.net/10722/189143
Authors: Han, G
Date: 2013-01-01

- Title: One-Dimensional Markov Random Fields, Markov Chains and Topological Markov Fields
http://hdl.handle.net/10722/202986
Authors: Chandgotia, N; Han, G; Marcus, B; Meyerovitch, T; Pavlov, R
Abstract: A topological Markov chain is the support of an ordinary first-order Markov chain. We develop the concept of topological Markov field (TMF), which is the support of a Markov random field. Using this, we show that any one-dimensional (discrete-time, finite-alphabet) stationary Markov random field must be a stationary Markov chain, and we give a version of this result for continuous-time processes. We also give a general finite procedure for deciding if a given shift space is a TMF.
Date: 2014-01-01

- Title: The Minimum Number of Hubs in Networks
http://hdl.handle.net/10722/202987
Authors: Xu, L; Han, G
Abstract: In this paper, a hub refers to a non-terminal vertex of degree at least three. We study the minimum number of hubs needed in a network to guarantee certain flow demand constraints imposed between multiple pairs of sources and sinks. We prove that under the constraints, regardless of the size of the network, such minimum number is always upper bounded and we derive tight upper bounds for some special parameters. In particular, for two pairs of sources and sinks, we present a novel path-searching algorithm, the analysis of which is instrumental for the derivations of the tight upper bounds. Our results are of both theoretical and practical interest: in theory, they can be viewed as generalizations of the classical Menger’s theorem to a class of undirected graphs with multiple sources and sinks; in practice, our results, roughly speaking, suggest that for some given flow demand constraints, not “too many” hubs are needed in a network.
Date: 2015-01-01

- Title: Counterexample to the Vector Generalization of Costa’s Entropy Power Inequality, and Partial Resolution
http://hdl.handle.net/10722/258609
Authors: Courtade, T; Han, G; Wu, Y
Abstract: We give a counterexample to the vector generalization of Costa's entropy power inequality due to Liu et al. In particular, the claimed inequality can fail if the matrix-valued parameter in the convex combination does not commute with the covariance of the additive Gaussian noise. Conversely, the inequality holds if these two matrices commute.
Date: 2018-01-01

- Title: Asymptotics of input-constrained erasure channel capacity
http://hdl.handle.net/10722/258610
Authors: Li, Y; Han, G
Date: 2018-01-01

- Title: Feedback Capacity of Stationary Gaussian Channels Further Examined
http://hdl.handle.net/10722/279676
Authors: Han, G; Liu, T
Abstract: It is well known that the problem of computing the feedback capacity of a stationary Gaussian channel can be recast as an infinite-dimensional optimization problem; moreover, necessary and sufficient conditions for the optimality of a solution to this optimization problem have been characterized, and based on this characterization, an explicit formula for the feedback capacity has been given for the case that the noise is a first-order autoregressive moving-average Gaussian process. In this paper, we further examine the above-mentioned infinite-dimensional optimization problem. We prove that unless the Gaussian noise is white, its optimal solution is unique, and we propose an algorithm to recursively compute the unique optimal solution, which is guaranteed to converge in theory and features an efficient implementation for a suboptimal solution in practice. Furthermore, for the case that the noise is a k-th order autoregressive moving-average Gaussian process, we give a relatively more explicit formula for the feedback capacity; more specifically, the feedback capacity is expressed as a simple function evaluated at a solution to a system of polynomial equations, which is amenable to numerical computation for the cases k = 1, 2 and possibly beyond.
Description: Session: Channel Capacity 1
Date: 2018-01-01

- Title: An Elementary Proof of a Classical Information-Theoretic Formula
http://hdl.handle.net/10722/279677
Authors: Liu, X; Bustin, R; Han, G; Shamai, S
Abstract: A renowned information-theoretic formula by Shannon expresses the mutual information rate of a white Gaussian channel with a stationary Gaussian input as an integral of a simple function of the power spectral density of the channel input. We give in this paper a rigorous yet elementary proof of this classical formula. As opposed to the conventional approaches, which either rely on heavy mathematical machinery or have to resort to some “external” results, our proof, which hinges on a recently proven sampling theorem, is elementary and self-contained, using only some well-known facts from basic calculus and matrix theory.
Description: Comfort Class: Information-Related
Date: 2019-01-01
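Shannon's formula in the abstract above can be exercised numerically: with a flat power spectral density, the integral of log2(1 + S(f)/N0) over the band collapses to a familiar closed form. A hedged sketch (the one-sided-spectrum normalization and all parameter values are my own choices for illustration, not the paper's conventions):

```python
import math

def mi_rate(psd, n0, f_lo, f_hi, steps=100000):
    """Midpoint-rule integration of log2(1 + S(f)/n0) over [f_lo, f_hi].
    With a flat PSD this recovers the closed form W * log2(1 + P/(n0*W))."""
    df = (f_hi - f_lo) / steps
    return sum(math.log2(1 + psd(f_lo + (i + 0.5) * df) / n0) * df
               for i in range(steps))

W, P, n0 = 10.0, 5.0, 1.0
flat = lambda f: P / W          # flat PSD carrying total power P over [0, W]
numeric = mi_rate(flat, n0, 0.0, W)
closed = W * math.log2(1 + P / (n0 * W))
print(abs(numeric - closed) < 1e-9)  # True
```

For a non-flat PSD the same integrator still applies; only the closed-form comparison is lost, which is precisely where a rigorous proof of the integral formula earns its keep.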
- Title: Information-Theoretic Extensions of the Shannon-Nyquist Sampling Theorem
http://hdl.handle.net/10722/269902
Authors: Han, G
Date: 2018-01-01

- Title: An Elementary Proof of a Classical Information-Theoretic Formula
http://hdl.handle.net/10722/277280
Authors: Liu, X; Bustin, R; Han, G; Shamai, S
Abstract: A renowned information-theoretic formula by Shannon expresses the mutual information rate of a white Gaussian channel with a stationary Gaussian input as an integral of a simple function of the power spectral density of the channel input. We give in this paper a rigorous yet elementary proof of this classical formula. As opposed to the conventional approaches, which either rely on heavy mathematical machinery or have to resort to some “external” results, our proof, which hinges on a recently proven sampling theorem, is elementary and self-contained, using only some well-known facts from basic calculus and matrix theory.
Date: 2019-01-01

- Title: A Deterministic Algorithm for the Capacity of Finite-State Channels
http://hdl.handle.net/10722/277281
Authors: Wu, C; Han, G; Marcus, B
Abstract: We propose a modified version of the classical gradient descent method to compute the capacity of finite-state channels with Markovian input. Under some concavity assumptions, our algorithm proves to achieve polynomial accuracy in polynomial time for general finite-state channels. Moreover, for some special families of finite-state channels, our algorithm can achieve exponential accuracy in polynomial time.
Date: 2019-01-01

- Title: On the Capacity of the Flash Memory Channel with Inter-cell Interference
http://hdl.handle.net/10722/277282
Authors: Li, Y; Han, G; Siegel, P
Abstract: In this paper, we consider a discrete channel with inter-cell interference (ICI) as a model for NAND flash memory. We derive an explicit formula for the mutual information rate when the input is Markovian. Using this formula, we obtain the asymptotics of the channel capacity in the high signal-to-noise ratio (SNR) regime.
Date: 2019-01-01

- Title: Prime Factorization Theory of Networks
http://hdl.handle.net/10722/209262
Authors: Li, SYR; Han, G; Yang, Y
Date: 2011-01-01

- Title: Analyticity of Entropy Rate of Hidden Markov Chains With Continuous Alphabet
http://hdl.handle.net/10722/218754
Authors: Han, G; Marcus, B
Abstract: We first prove that under certain mild assumptions, the entropy rate of a hidden Markov chain, observed when passing a finite-state stationary Markov chain through a discrete-time continuous-output channel, is analytic with respect to the input Markov chain parameters. We then further prove, under strengthened assumptions on the channel, that the entropy rate is jointly analytic as a function of both the input Markov chain parameters and the channel parameters. In particular, the main theorems establish the analyticity of the entropy rate for two representative channels: Cauchy and Gaussian.
Date: 2015-01-01

- Title: A Randomized Algorithm for the Capacity of Finite-State Channels
http://hdl.handle.net/10722/217072
Authors: Han, G
Abstract: Inspired by ideas from the field of stochastic approximation, we propose a randomized algorithm to compute the capacity of a finite-state channel with a Markovian input. When the mutual information rate of the channel is concave with respect to the chosen parameterization, the proposed algorithm is proven to converge to the capacity of the channel almost surely, with an explicit convergence rate. We also discuss the convergence behavior of the algorithm without the concavity assumption.
Thu, 01 Jan 2015 00:00:00 GMThttp://hdl.handle.net/10722/2170722015-01-01T00:00:00Z
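The stochastic-approximation idea mentioned above can be illustrated with a minimal Robbins-Monro style sketch: maximize a concave function from noisy gradient evaluations using diminishing step sizes. The objective f(theta) = -(theta - 2)^2 and the Gaussian noise model are illustrative assumptions, not the channel-capacity objective of the paper.

```python
import random

random.seed(0)

def noisy_gradient(theta):
    """Unbiased estimate of the gradient of f(theta) = -(theta - 2)**2."""
    return -2.0 * (theta - 2.0) + random.gauss(0.0, 1.0)

def robbins_monro(theta=0.0, steps=20000):
    """Stochastic approximation with step sizes a_t = 1/t: the steps sum
    to infinity while their squares are summable, which lets the iterates
    reach the maximizer despite the gradient noise."""
    for t in range(1, steps + 1):
        theta += (1.0 / t) * noisy_gradient(theta)
    return theta

theta_hat = robbins_monro()
# theta_hat should be close to the maximizer theta* = 2.
```

The almost-sure convergence claimed in the abstract for the capacity computation rests on the same mechanism, with the noisy gradient coming from simulated channel outputs rather than an explicit formula.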
- On network coding advantage for multiple unicast networkshttp://hdl.handle.net/10722/218964Title: On network coding advantage for multiple unicast networks
Authors: Cai, K; Han, G
Abstract: In this paper, by studying the feasible fractional routing solution under the so-called full reachability condition, we give bounds on the network coding advantage for undirected multiple unicast networks. More precisely, we prove that, for a certain class of fully reachable networks, the network coding advantage is upper bounded by 9/8, improving the previous bound of 3 by M. Langberg and M. Médard.
Thu, 01 Jan 2015 00:00:00 GMThttp://hdl.handle.net/10722/2189642015-01-01T00:00:00Z
- On Connections between Stochastic Calculus and Information Theoryhttp://hdl.handle.net/10722/228526Title: On Connections between Stochastic Calculus and Information Theory
Authors: Liu, X; Han, G
Abstract: In this talk, making use of interesting connections between stochastic differential equations and information theory, we derive the capacity regions of several classes of continuous-time Gaussian channels. The 'complete' results obtained in this work stand in stark contrast to the status quo of network information theory in discrete time, where the capacity regions of all the above-mentioned channels are known only for a handful of special scenarios.
Description: MS-Th-E-10: Stochastic Dynamics with Applications - paper no. MS-Th-E-10-1
Thu, 01 Jan 2015 00:00:00 GMThttp://hdl.handle.net/10722/2285262015-01-01T00:00:00Z
- Coding advantage in communications among peershttp://hdl.handle.net/10722/232366Title: Coding advantage in communications among peers
Authors: Cai, K; Han, G
Abstract: We consider the problem of network coding advantage in a communication scenario where information exchange is bi-directional and peers communicate via multiple unicast sessions. In such a setting, we study the overall performance of all multiple unicast sessions and propose a version of the multiple unicast conjecture. One of our main results is a weaker version of the proposed conjecture: Consider all the multiple unicast sessions associated with a number of terminals in an undirected network. Then, the common transmission rate of all these multiple unicast sessions achieved by network coding in the sense of Langberg and Médard [13] can also be achieved by fractional routing. © 2016 IEEE.
Fri, 01 Jan 2016 00:00:00 GMThttp://hdl.handle.net/10722/2323662016-01-01T00:00:00Z
- Asymptotics of input-constrained binary symmetric channel capacityhttp://hdl.handle.net/10722/58965Title: Asymptotics of input-constrained binary symmetric channel capacity
Authors: Han, G; Marcus, B
Abstract: We study the classical problem of noisy constrained capacity in the case of the binary symmetric channel (BSC), namely, the capacity of a BSC whose inputs are sequences chosen from a constrained set. Motivated by a result of Ordentlich and Weissman [In Proceedings of IEEE Information Theory Workshop (2004) 117-122], we derive an asymptotic formula (when the noise parameter is small) for the entropy rate of a hidden Markov chain, observed when a Markov chain passes through a BSC. Using this result, we establish an asymptotic formula for the capacity of a BSC with input process supported on an irreducible finite type constraint, as the noise parameter tends to zero. © Institute of Mathematical Statistics, 2009.
Thu, 01 Jan 2009 00:00:00 GMThttp://hdl.handle.net/10722/589652009-01-01T00:00:00Z
- Limit theorems for the sample entropy of hidden Markov chainshttp://hdl.handle.net/10722/135893Title: Limit theorems for the sample entropy of hidden Markov chains
Authors: Han, G
Abstract: The Shannon-McMillan-Breiman theorem asserts that the sample entropy of a stationary and ergodic stochastic process converges to the entropy rate of the same process (as the sample size tends to infinity) almost surely. In this paper, we restrict our attention to the convergence behavior of the sample entropy of hidden Markov chains. Under certain positivity assumptions, we prove a central limit theorem (CLT) with a Berry-Esseen bound for the sample entropy of a hidden Markov chain, and we use this CLT to establish a law of the iterated logarithm (LIL) for the sample entropy. © 2011 IEEE.
Sat, 01 Jan 2011 00:00:00 GMThttp://hdl.handle.net/10722/1358932011-01-01T00:00:00Z
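The sample entropy in the abstract above, -(1/n) log P(Y_1..Y_n), can be computed for a hidden Markov chain with the scaled forward recursion, and the Shannon-McMillan-Breiman convergence can be observed numerically. The two-state chain and flip probability below are illustrative assumptions, not parameters from the paper.

```python
import math
import random

random.seed(1)

P = [[0.7, 0.3], [0.3, 0.7]]   # hidden-state transition probabilities
EPS = 0.2                       # probability that an observation is flipped

def sample_entropy(n):
    """-(1/n) log2 P(y_1..y_n) via the scaled forward algorithm."""
    x = random.randrange(2)     # hidden state, started from the uniform
    alpha = [0.5, 0.5]          # stationary distribution of the hidden chain
    log_prob = 0.0
    for _ in range(n):
        x = 0 if random.random() < P[x][0] else 1       # advance hidden chain
        y = x if random.random() > EPS else 1 - x       # noisy observation
        # forward update: predict the hidden state, weight by emission likelihood
        pred = [alpha[0] * P[0][j] + alpha[1] * P[1][j] for j in (0, 1)]
        b = [(1 - EPS) if y == j else EPS for j in (0, 1)]
        un = [pred[j] * b[j] for j in (0, 1)]
        s = un[0] + un[1]
        log_prob += math.log2(s)                        # accumulate log P(y_1..y_t)
        alpha = [u / s for u in un]                     # rescale to avoid underflow
    return -log_prob / n

h_a = sample_entropy(50000)
h_b = sample_entropy(50000)
# Independent runs should agree closely, reflecting almost-sure convergence
# to the entropy rate; the CLT in the paper quantifies the fluctuation scale.
```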
- Multi-criteria Student Project Allocation: A Case Study of Goal Programming Formulation with DSS Implementationhttp://hdl.handle.net/10722/100362Title: Multi-criteria Student Project Allocation: A Case Study of Goal Programming Formulation with DSS Implementation
Authors: Pan, L; Chu, SCK; Han, G; Huang, JZ
Abstract: This paper concerns a typical Student Project Allocation (SPA) problem involving distributing a set of projects to students of an undergraduate 'Directed Studies in Mathematics' course at the Department of Mathematics of the University of Hong Kong. Each student indicates a preference list over their eligible projects, while the Department wants to maximize the number of allocations, with preferences over individual students according to their GPAs. We apply pre-emptive goal programming to a multi-criteria SPA model for allocating these projects to students with a DSS implementation. The numerical results illustrate the effectiveness and efficiency of this approach.
Description: The conference is organized by Asia-Pacific Operations Research Center (APORC) under APORS and the Chinese Academy of Sciences (CAS).
Thu, 01 Jan 2009 00:00:00 GMThttp://hdl.handle.net/10722/1003622009-01-01T00:00:00Z
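The pre-emptive (lexicographic) goal structure described in the abstract above can be sketched on a toy instance: first maximize the number of allocated students, then break ties by a GPA-weighted preference score. The student names, GPAs, and preference lists below are entirely hypothetical, and brute-force enumeration stands in for the paper's goal-programming formulation, which it is not.

```python
import itertools

# Hypothetical toy instance (not from the paper).
students = {
    "A": {"gpa": 3.9, "prefs": ["p1", "p2"]},
    "B": {"gpa": 3.5, "prefs": ["p1"]},
    "C": {"gpa": 3.7, "prefs": ["p2", "p3"]},
}

def pref_score(name, proj):
    """GPA-weighted preference: higher-ranked choices count more."""
    s = students[name]
    return s["gpa"] * (len(s["prefs"]) - s["prefs"].index(proj))

def allocate():
    """Pre-emptive goals by brute force: lexicographically maximize
    (number of allocated students, total GPA-weighted preference)."""
    names = list(students)
    options = [students[n]["prefs"] + [None] for n in names]
    best, best_key = {}, (-1, -1.0)
    for choice in itertools.product(*options):
        picked = [c for c in choice if c is not None]
        if len(picked) != len(set(picked)):
            continue  # each project goes to at most one student
        key = (len(picked),
               sum(pref_score(n, c) for n, c in zip(names, choice) if c))
        if key > best_key:
            best_key, best = key, dict(zip(names, choice))
    return best

assignment = allocate()
# The first-priority goal forces B onto p1 (its only eligible project),
# pushing A to p2 and C to p3 so that every student is allocated.
```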