:<math> R < C \,</math>
there exists a coding technique that allows the probability of error at the receiver to be made arbitrarily small. This means that, theoretically, it is possible to transmit information nearly without error at any rate up to the channel capacity <math>C</math>.
The converse is also important. If
:<math> R > C \,</math>
an arbitrarily small probability of error is not achievable. So, information cannot be guaranteed to be transmitted reliably across a channel at rates beyond the channel capacity. The theorem does not address the rare situation in which rate and capacity are equal.
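To make the bounds above concrete, here is a minimal Python sketch (an illustration, not part of the original article) computing the capacity of a binary symmetric channel with crossover probability <math>p</math>, which is the standard closed form <math>C = 1 - H(p)</math> with <math>H</math> the binary entropy function:

```python
from math import log2

def bsc_capacity(p: float) -> float:
    """Capacity of a binary symmetric channel with crossover
    probability p, in bits per channel use: C = 1 - H(p)."""
    if p in (0.0, 1.0):
        # A perfectly reliable (or perfectly inverting) channel
        # carries one full bit per use.
        return 1.0
    # Binary entropy H(p) = -p log2 p - (1-p) log2 (1-p)
    h = -p * log2(p) - (1 - p) * log2(1 - p)
    return 1 - h

# A channel that flips about 11% of bits still has roughly
# half a bit of capacity per use; at p = 0.5 the capacity is zero.
print(bsc_capacity(0.11))
print(bsc_capacity(0.5))
```

Any rate <math>R</math> below this value is achievable with arbitrarily small error probability; any rate above it is not.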
Simple schemes such as "send the message three times and use a best-two-out-of-three voting scheme if the copies differ" are inefficient error-correction methods, unable to asymptotically guarantee that a block of data can be communicated free of error. Advanced techniques such as [[Reed-Solomon code]]s and, more recently, [[Turbo code]]s come much closer to reaching the theoretical Shannon limit, but at the cost of high computational complexity. With Turbo codes and the computing power of today's [[digital signal processors]], it is now possible to come within 1/10 of one [[decibel]] of the Shannon limit.
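The inefficiency of the triple-repetition scheme can be seen directly: majority voting fails only when two or three of the copies are corrupted, so over a binary symmetric channel with flip probability <math>p</math> the residual error per bit is <math>3p^2(1-p) + p^3</math>. A short Python sketch (illustrative, not from the original article):

```python
def repetition3_error(p: float) -> float:
    """Probability that best-two-out-of-three majority voting decodes
    a bit wrongly, when each of the three copies is flipped
    independently with probability p over a binary symmetric channel."""
    # Decoding fails if exactly two copies flip, or all three do.
    return 3 * p**2 * (1 - p) + p**3

# The error probability drops (e.g. from 0.1 to 0.028 per bit),
# but never to zero for any fixed p > 0.
for p in (0.1, 0.01):
    print(p, repetition3_error(p))
```

The error rate improves from <math>p</math> to roughly <math>3p^2</math>, but the code's rate is stuck at 1/3, and driving the error probability toward zero by adding more repetitions pushes the rate toward zero as well. Shannon's theorem guarantees that better codes exist with rates all the way up to <math>C</math>.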