Shannon-Hartley Theorem
In information theory, the Shannon-Hartley theorem describes the maximum rate at which information can be transmitted over a communications channel of a specified bandwidth in the presence of noise. It is an application of the noisy-channel coding theorem to the archetypal case of a continuous-time analog communications channel subject to Gaussian noise.

The theorem establishes Shannon's channel capacity for such a communication link. This bound is the maximum amount of error-free information per unit time that can be transmitted with a specified bandwidth in the presence of noise interference. The assumption is that the signal power is bounded, and that the Gaussian noise process is characterized by a known power or power spectral density.
The theorem states the channel capacity C, the tightest theoretical upper bound on the rate at which data can be communicated with an arbitrarily low error rate using an average received signal power S through an analog communication channel subject to additive white Gaussian noise of power N:

C = B * log2(1 + S/N)

where

C is the channel capacity in bits per second;
B is the bandwidth of the channel in hertz;
S is the average received signal power in watts;
N is the average noise power in watts;
S/N is the signal-to-noise ratio, expressed as a linear power ratio.
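
As a quick numerical illustration, a minimal sketch in Python (the 3000 Hz bandwidth and 20 dB signal-to-noise ratio below are assumed example values, not figures from the theorem itself):

import math

def shannon_capacity(bandwidth_hz, snr_linear):
    # C = B * log2(1 + S/N), in bits per second.
    return bandwidth_hz * math.log2(1 + snr_linear)

# Assumed example: a 3000 Hz channel with a 20 dB SNR.
# 20 dB is a linear power ratio of 10**(20/10) = 100.
snr = 10 ** (20.0 / 10.0)
print(shannon_capacity(3000.0, snr))  # 3000 * log2(101), about 19975 bit/s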

Nyquist determined that the number of independent pulses that could be put through a telegraph channel per unit time is limited to twice the bandwidth of the channel. In symbols,

fp = 2B

where

fp is the pulse rate, in pulses per second;
B is the bandwidth of the channel in hertz.

The quantity 2B later came to be called the Nyquist rate, and transmitting at the limiting pulse rate of 2B pulses per second became known as signalling at the Nyquist rate.
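As a worked example (the bandwidth figure is assumed for illustration): a channel of bandwidth B = 4000 Hz supports at most fp = 2 * 4000 = 8000 independent pulses per second, so signalling at the Nyquist rate here means 8000 pulses per second.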

Hartley formulated a way to quantify information and its line rate (also known as the data signalling rate, R, in bits per second). This method, later known as Hartley's law, became an important precursor for Shannon's more sophisticated notion of channel capacity.
Hartley argued that the maximum number of distinguishable pulse levels that can be transmitted and received reliably over a communications channel is limited by the dynamic range of the signal amplitude and the precision with which the receiver can distinguish amplitude levels. Specifically, if the amplitude of the transmitted signal is restricted to the range of [-A ... +A] volts, and the precision of the receiver is ±ΔV volts, then the maximum number of distinct pulses M is given by

M = 1 + A/ΔV
By taking information per pulse in bit/pulse to be the base-2-logarithm of the number of distinct messages M that could be sent, Hartley constructed a measure of the line rate R as:
R = fp * log2(M)
where fp is the pulse rate, also known as the symbol rate, in symbols/second or baud.
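
A minimal sketch of Hartley's two formulas in Python (the amplitude range, receiver precision, and pulse rate below are assumed example values):

import math

def hartley_levels(amplitude_volts, precision_volts):
    # M = 1 + A/ΔV: distinguishable levels in the range [-A, +A]
    # when the receiver resolves amplitudes to within ±ΔV.
    return 1 + amplitude_volts / precision_volts

def hartley_line_rate(pulse_rate, levels):
    # R = fp * log2(M), in bits per second.
    return pulse_rate * math.log2(levels)

# Assumed example: amplitude limited to ±7 V, precision ±1 V,
# signalling at fp = 8000 pulses per second.
M = hartley_levels(7.0, 1.0)         # 8 distinguishable levels
print(hartley_line_rate(8000.0, M))  # 8000 * log2(8) = 24000 bit/s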

Combining the above quantification with Nyquist's observation that the number of independent pulses that could be put through a channel of bandwidth B hertz was 2B pulses per second, gives rise to a quantitative measure for achievable line rate.
Hartley's law is sometimes quoted as just a proportionality between the analog bandwidth, B, in hertz, and what is called the digital bandwidth, R, in bit/s.
On other occasions it is quoted in this more quantitative form, as an achievable line rate of R bits per second:

R ≤ 2B * log2(M)
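A minimal sketch of this combined bound in Python (the bandwidth and level count are assumed example values):

import math

def hartley_max_rate(bandwidth_hz, levels):
    # R <= 2B * log2(M): Nyquist's pulse-rate limit combined with
    # Hartley's bits-per-pulse measure.
    return 2 * bandwidth_hz * math.log2(levels)

# Assumed example: B = 4000 Hz, M = 8 distinguishable levels.
print(hartley_max_rate(4000.0, 8))  # 2 * 4000 * 3 = 24000 bit/s

Comparing this with the Shannon-Hartley formula above: choosing M = sqrt(1 + S/N) makes 2B * log2(M) equal B * log2(1 + S/N), which is one way to see Hartley's law as a precursor of Shannon's channel capacity.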



