Conditions for Information Capacity of the Discrete-Time Gaussian Channel to be Increased by Feedback

Author: Charles R. Baker
Language: English
Pages: 25


Book Description
Sufficient conditions are given for optimal causal feedback to increase information capacity for the discrete-time additive Gaussian channel. The conditions are obtained by assuming linear feedback and reformulating the problem as an equivalent problem without feedback. Keywords: Channel capacity; Shannon theory; Information theory; Channels with feedback; Gaussian channels.

Information Capacity of Gaussian Channels

Author: Charles R. Baker
Language: English
Pages: 23


Book Description
Information capacity of Gaussian channels is one of the basic problems of information theory. Shannon's results for white Gaussian channels and Fano's water-filling analysis of stationary Gaussian channels are two of the best-known works of early information theory. Results are given here that extend these results, and others due to Gallager and to Kadota, Zakai, and Ziv, to a general framework. The development applies to arbitrary Gaussian channels when the channel noise has sample paths in a separable Banach space, and to a large class of Gaussian channels when the noise has sample paths in a topological vector space. Solutions for the capacity are given for both matched and mismatched channels. Keywords: Gaussian channels; Channel capacity; Shannon theory; Information theory.
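The water-filling analysis mentioned above has a simple computational form for parallel Gaussian subchannels. As an illustrative sketch (not taken from the report itself): power p_i = max(mu - n_i, 0) is "poured" over subchannels with noise variances n_i, with the water level mu raised until the total power budget is spent.

```python
import numpy as np

def water_filling(noise, power, tol=1e-9):
    """Water-filling over parallel Gaussian subchannels.

    Subchannel i with noise variance noise[i] receives power
    p_i = max(mu - noise[i], 0), where the water level mu is found by
    bisection so the allocations sum to the total power budget.
    Returns the allocation and the capacity in bits per channel use.
    """
    noise = np.asarray(noise, dtype=float)
    lo, hi = noise.min(), noise.max() + power  # mu is bracketed here
    while hi - lo > tol:
        mu = 0.5 * (lo + hi)
        if np.maximum(mu - noise, 0.0).sum() > power:
            hi = mu  # water level too high, allocation exceeds budget
        else:
            lo = mu
    p = np.maximum(0.5 * (lo + hi) - noise, 0.0)
    capacity = 0.5 * np.sum(np.log2(1.0 + p / noise))
    return p, capacity
```

With noise variances (1, 2, 4) and a power budget of 3, the quietest subchannel receives the most power; when all noise variances are equal, the allocation is uniform and the formula reduces to Shannon's result for the white channel.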

On the Information Capacity of Peak and Average Power Constrained Gaussian Channels

Author: Joel Gorham Smith
Language: English
Pages: 246


Information and Coding Capacities of Mismatched Gaussian Channels

Author: Charles R. Baker
Language: English
Pages: 12


Book Description
Recent results on coding capacity and information capacity for the mismatched Gaussian channel are discussed. Sufficient conditions for causal feedback to increase information capacity are given for the finite-dimensional discrete-time Gaussian channel. Keywords: Gaussian channels; Channel capacity; Shannon theory; Information theory.

Information and Communication Theory

Author: Stefan Host
Publisher: John Wiley & Sons
ISBN: 1119433800
Category: Technology & Engineering
Language: English
Pages: 368


Book Description
An important text that offers an in-depth guide to how information theory sets the boundaries for data communication. In an accessible and practical style, Information and Communication Theory explores information theory and includes concrete tools that are appropriate for real-life communication systems. The text investigates the connection between theoretical and practical applications through a wide variety of topics, including an introduction to the basics of probability theory, information, (lossless) source coding, typical sequences as a central concept, channel coding, continuous random variables, Gaussian channels, discrete-input continuous channels, and a brief look at rate-distortion theory. The author explains the fundamental theory together with typical compression algorithms and how they are used in practice. He moves on to review source coding and how much a source can be compressed, and explains algorithms such as the LZ family, with applications to formats such as zip and png. In addition to exploring the channel coding theorem, the book includes illustrative examples of codes. This comprehensive text:

- Provides an adaptive version of Huffman coding that estimates the source distribution
- Contains a series of problems that enhance understanding of the material presented in the text
- Covers a variety of topics including optimal source coding, channel coding, modulation, and much more
- Includes appendices that explore probability distributions and the sampling theorem

Written for graduate and undergraduate students studying information theory, as well as professional engineers and master's students, Information and Communication Theory offers an introduction to how information theory sets the boundaries for data communication.
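The Gaussian-channel material such textbooks build toward centers on one closed-form result: the capacity of the discrete-time additive white Gaussian noise channel. A minimal sketch of that formula (an illustration, not code from the book):

```python
import math

def awgn_capacity(snr):
    """Capacity in bits per channel use of the discrete-time additive
    white Gaussian noise channel: C = 0.5 * log2(1 + SNR), where SNR is
    the ratio of signal power to noise variance."""
    return 0.5 * math.log2(1.0 + snr)

def awgn_capacity_db(snr_db):
    """Same capacity, with the signal-to-noise ratio given in decibels."""
    return awgn_capacity(10.0 ** (snr_db / 10.0))
```

At an SNR of 1 (0 dB) the channel supports half a bit per use; each quadrupling of the SNR at high SNR adds roughly one bit per use.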

On the Capacity of Channels with Gaussian and Non-Gaussian Noise

Author: I. W. McKeague
Language: English
Pages: 21


Book Description
We evaluate the information capacity of channels for which the noise process is a Gaussian measure on a quasi-complete locally convex space. The coding capacity is calculated in this setting, and for time-continuous Gaussian channels, using the information capacity result. The coding capacity of channels with non-Gaussian noise having finite entropy with respect to Gaussian noise of the same covariance is shown not to exceed the coding capacity of the Gaussian channel. The sensitivity of the information capacity to deviations from normality in the noise process is also investigated.

Capacity of Generalized Mismatched Gaussian Channels

Author: C. R. Baker
Language: English
Pages: 20


Book Description
Information capacity is determined for a Gaussian communication channel when the constraint is given in terms of a covariance which is different from that of the channel noise.

Capacity of Gaussian Noise Channels with Side Information and Feedback

Author: Thierry Etienne Klein
Language: English
Pages: 306


Information Capacity of the Mismatched Gaussian Channel

Author: Charles R. Baker
Language: English
Pages: 38


Book Description
Information capacity is determined for the additive Gaussian channel when the constraint is given in terms of a covariance different from that of the channel noise. These results, combined with previous results on capacity when the constraint covariance is the same as the noise covariance, provide a complete and general solution for the information capacity of the Gaussian channel without feedback. They are valid for both continuous-time and discrete-time channels, and require only two assumptions: the noise energy over the observation period is finite (with probability one), and the constraint is given in terms of a reproducing kernel Hilbert space norm. Applications include channels with ambient noise having unknown covariance, and channels subject to jamming. The results for the mismatched channel differ markedly from those for the matched channel.
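In the finite-dimensional case, the mismatched problem just described can be sketched concretely (an illustration under stated assumptions, not the paper's construction): maximize (1/2) log det(I + N^{-1} R_X) over input covariances R_X subject to tr(K^{-1} R_X) <= P, where N is the noise covariance and K is the constraint covariance. The change of variables R_X = K^{1/2} Y K^{1/2} turns the constraint into tr(Y) <= P, reducing the problem to water-filling over the eigenvalues of K^{1/2} N^{-1} K^{1/2}; the matched case K = N recovers the usual water-filling over the noise spectrum.

```python
import numpy as np

def mismatched_capacity(noise_cov, constraint_cov, power, tol=1e-9):
    """Capacity sketch for a finite-dimensional Gaussian channel whose
    input constraint tr(K^{-1} R_X) <= P uses a covariance K different
    from the noise covariance N. After whitening with K^{1/2}, the
    problem is water-filling over the eigenvalues of K^{1/2} N^{-1} K^{1/2}.
    Returns capacity in bits per channel use."""
    N = np.asarray(noise_cov, dtype=float)
    K = np.asarray(constraint_cov, dtype=float)
    # symmetric square root of the constraint covariance K
    w, V = np.linalg.eigh(K)
    K_half = V @ np.diag(np.sqrt(w)) @ V.T
    lam = np.linalg.eigvalsh(K_half @ np.linalg.inv(N) @ K_half)
    inv = 1.0 / lam  # effective per-mode "noise levels"
    # bisect for the water level mu with sum(max(mu - inv, 0)) = power
    lo, hi = inv.min(), inv.max() + power
    while hi - lo > tol:
        mu = 0.5 * (lo + hi)
        if np.maximum(mu - inv, 0.0).sum() > power:
            hi = mu
        else:
            lo = mu
    p = np.maximum(0.5 * (lo + hi) - inv, 0.0)
    return 0.5 * np.sum(np.log2(1.0 + lam * p))
```

For a scalar matched channel with unit noise variance and power 3, this yields 0.5 * log2(4) = 1 bit per use, agreeing with the classical formula.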

Capacity of Mismatched Gaussian Channels

Author: C. R. Baker
Language: English
Pages: 15


Book Description
The capacity of the Gaussian channel without feedback, subject to a generalized energy constraint, was determined in an earlier document. In that work, the constraint is given in terms of the covariance of the channel noise process. However, there are many situations in which one may wish to determine capacity subject to a constraint determined by a covariance that is different from that of the channel noise; an example is jamming or countermeasures situations. Channels where the covariance of the noise is the same as that of the constraint will be called matched channels; otherwise, we say that the channel is mismatched (to the constraint). In this paper, the capacity of the mismatched Gaussian channel is determined for two situations: the finite-dimensional channel, and the infinite-dimensional channel with a dimensionality constraint on the space of transmitted signals. Results on the infinite-dimensional mismatched channel without a dimensionality constraint on the signal are given elsewhere. Various special cases of the mismatched channel have been treated previously. The results for the mismatched channel differ significantly from those for the matched channel. A discussion of these differences follows the proof of the main result.