== Overview ==
Stated by [[Claude Shannon]] in 1948, the theorem describes the maximum possible efficiency of [[error-correcting code|error-correcting methods]] in the presence of noise interference and data corruption. Shannon's theorem has wide-ranging applications in both communications and [[data storage device|data storage]], and it is of foundational importance to the modern field of [[information theory]]. Shannon gave only an outline of the proof; the first rigorous proof for the discrete case was given by [[Amiel Feinstein]]<ref>{{cite journal |last1=Feinstein |first1=Amiel |title=A new basic theorem of information theory |journal=Transactions of the IRE Professional Group on Information Theory |date=September 1954 |volume=4 |issue=4 |pages=2–22 |doi=10.1109/TIT.1954.1057459 |hdl=1721.1/4798 |bibcode=1955PhDT........12F |hdl-access=free}}</ref> in 1954.
The Shannon theorem states that, given a noisy channel with [[channel capacity]] ''C'' and information transmitted at a rate ''R'', if <math>R < C</math> then there exist [[code]]s that allow the [[probability of error]] at the receiver to be made arbitrarily small. This means that, theoretically, it is possible to transmit information nearly without error at any rate below the limiting rate ''C''.
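As a concrete illustration (using the standard [[binary symmetric channel]], which is not discussed in the passage above), consider a channel that flips each transmitted bit with crossover probability <math>p</math>. Its capacity is

:<math>C = 1 - H_b(p) = 1 + p \log_2 p + (1-p) \log_2 (1-p),</math>

where <math>H_b</math> is the [[binary entropy function]]. For <math>p = 0.11</math>, one finds <math>H_b(p) \approx 0.5</math> and hence <math>C \approx 0.5</math> bits per channel use, so the theorem guarantees the existence of codes with arbitrarily small error probability at any rate <math>R < 0.5</math> bits per use over this channel.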