== Overview ==
Proved by [[Claude Shannon]] in [[1948]], the theorem describes the maximum possible efficiency of [[error-correcting code|error-correcting methods]] versus levels of noise interference and data corruption. The theory does not describe ''how to construct'' the error-correcting method; it only tells us how good the ''best possible'' method can be. Shannon's theorem has wide-ranging applications in both communications and [[data storage device|data storage]]. This theorem is of foundational importance to the modern field of [[information theory]].
The Shannon theorem states that given a ''noisy'' channel with information capacity C and information transmitted at a rate R, then if
:<math> R < C \,</math>
an arbitrarily small probability of error is not achievable. So, information cannot be guaranteed to be transmitted reliably across a channel at rates beyond the channel capacity. The theorem does not address the rare situation in which rate and capacity are equal.
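As a numerical illustration (not part of the theorem itself), the capacity of a [[binary symmetric channel]] with crossover probability ''p'' is known to be C = 1 − H(''p''), where H is the binary entropy function. The sketch below computes this capacity and checks whether a given rate R falls below it; the function names are chosen here for illustration.

```python
import math

def binary_entropy(p):
    """Binary entropy H(p) in bits; H(0) = H(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p,
    in bits per channel use: C = 1 - H(p)."""
    return 1.0 - binary_entropy(p)

# A channel that flips 10% of the bits still supports reliable
# transmission at any rate below about 0.531 bits per channel use.
C = bsc_capacity(0.1)
print(f"C = {C:.3f} bits per channel use")
print("R = 0.5 is achievable:", 0.5 < C)  # R < C, so yes
print("R = 0.6 is achievable:", 0.6 < C)  # R > C, so no
```

At p = 0.5 the output is pure noise and the capacity drops to zero: no rate above zero is reliably achievable.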
Simple schemes such as "send the message 3 times and use a best 2 out of 3 voting scheme if the copies differ" are inefficient error-correction methods, unable to asymptotically guarantee that a block of data can be communicated free of error. Advanced techniques such as [[Reed-Solomon code]]s and, more recently, [[Turbo code]]s come much closer to reaching the theoretical Shannon limit, but at a cost of high computational complexity. With Turbo codes and the computing power in today's [[digital signal processors]], it is now possible to reach within 1/10 of one [[decibel]] of the Shannon limit.
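The inefficiency of the triple-repetition scheme can be made concrete. Over a binary symmetric channel with crossover probability ''p'', the majority vote fails only when two or three of the copies are flipped, so the residual error probability is 3p²(1 − p) + p³. A minimal sketch of that arithmetic (function name chosen for illustration):

```python
# Residual bit-error probability of the "send 3 times, majority vote"
# scheme over a binary symmetric channel with crossover probability p.
# The vote fails when exactly 2 copies are flipped, or all 3 are.
def majority_error(p):
    return 3 * p**2 * (1 - p) + p**3

for p in (0.1, 0.01):
    print(f"p = {p}: raw error {p}, after voting {majority_error(p):.6f}")
```

The error rate improves (roughly from p to 3p²), but the transmission rate drops to 1/3, and repeating the trick only drives the rate toward zero; it never yields arbitrarily small error at a fixed positive rate, which is exactly what codes approaching the Shannon limit achieve.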
== Mathematical statement ==