Noisy Channel Theorem

The noisy-channel coding theorem (sometimes called Shannon's theorem or Shannon's limit) establishes that, for any given degree of noise contamination of a communication channel, it is possible to communicate discrete data (digital information) nearly error-free up to a computable maximum rate through the channel. The theorem describes the maximum possible efficiency of error-correcting methods versus levels of noise interference and data corruption.

Shannon's theorem has wide-ranging applications in both communications and data storage, and it is of fundamental importance to the field of information theory. The theorem states that, given a noisy channel with channel capacity C and information transmitted at a rate R, if R < C there exist codes that allow the probability of error at the receiver to be made arbitrarily small. This means that, theoretically, it is possible to transmit information nearly without error at any rate below the limiting rate C. The channel capacity C can be calculated from the physical properties of a channel; for a band-limited channel with Gaussian noise it is given by the Shannon-Hartley theorem.
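As a minimal illustration, the Shannon-Hartley formula C = B log2(1 + S/N) gives the capacity of such a channel, where B is the bandwidth in hertz and S/N is the signal-to-noise power ratio. The short Python sketch below evaluates it for a 3 kHz line with a 30 dB signal-to-noise ratio; both figures are assumptions chosen for the example, not values from this page.

    import math

    def shannon_hartley_capacity(bandwidth_hz, snr_linear):
        # Capacity in bits per second of a band-limited Gaussian-noise
        # channel: C = B * log2(1 + S/N).
        return bandwidth_hz * math.log2(1.0 + snr_linear)

    # Illustrative figures: a 3 kHz line with a 30 dB signal-to-noise ratio.
    snr = 10 ** (30 / 10)                         # 30 dB -> linear ratio of 1000
    print(shannon_hartley_capacity(3000.0, snr))  # about 29,902 bits per second

Any rate below this figure can, in principle, be achieved with an arbitrarily small error probability; no rate above it can.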

Simple schemes such as "send the message 3 times and use a best 2 out of 3 voting scheme if the copies differ" are inefficient error-correction methods: they cannot asymptotically guarantee that a block of data is communicated free of error (see the sketch below). Advanced techniques such as Reed-Solomon codes, low-density parity-check (LDPC) codes, and turbo codes come much closer to the theoretical Shannon limit, but at the cost of high computational complexity. Using these highly efficient codes, and with modern computing power, it is possible to come very close to the Shannon limit.
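To make the inefficiency of simple repetition concrete, the following Python sketch simulates the 3-times scheme over a binary symmetric channel; the flip probability p = 0.1 is an assumed value for illustration.

    import random

    def send_with_repeat3(bit, flip_prob):
        # Transmit the bit three times over a binary symmetric channel,
        # then decode with a best-2-out-of-3 majority vote.
        copies = [bit ^ (random.random() < flip_prob) for _ in range(3)]
        return 1 if sum(copies) >= 2 else 0

    p = 0.1                  # assumed channel flip probability
    trials = 100_000
    errors = sum(send_with_repeat3(0, p) != 0 for _ in range(trials))
    print(errors / trials)   # about 0.028, i.e. 3*p**2*(1-p) + p**3

Voting cuts the error rate from 0.1 to roughly 0.028, but the code rate is fixed at 1/3 and the residual error stays strictly positive for any fixed number of repetitions; only capacity-approaching codes such as those above drive it arbitrarily low at a useful rate.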

© Copyright 1998-1999 GOOD2USE