{{Information theory}}
{{redirect|Shannon's theorem|text=Shannon's name is also associated with the [[Nyquist–Shannon sampling theorem|sampling theorem]]}}
In [[information theory]], the '''noisy-channel coding theorem''' (sometimes '''Shannon's theorem''' or '''Shannon's limit''') establishes that for any given degree of [[Noisy channel model|noise contamination of a communication channel]], it is possible to communicate discrete data (digital [[information]]) nearly error-free up to a computable maximum rate through the channel. This result was presented by [[Claude Shannon]] in 1948 and was based in part on earlier work and ideas of [[Harry Nyquist]] and [[Ralph Hartley]].
The '''Shannon limit''' or '''Shannon capacity''' of a communication channel refers to the maximum [[Code rate|rate]] of error-free data that can theoretically be transferred over the channel.
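For example, in the special case of a channel with [[Bandwidth (signal processing)|bandwidth]] <math>B</math> subject to [[additive white Gaussian noise]], the capacity is given by the [[Shannon–Hartley theorem]]:

:<math>C = B \log_2\!\left(1 + \frac{S}{N}\right),</math>

where <math>S/N</math> is the [[signal-to-noise ratio]]. Transmission at any rate below <math>C</math> can be made arbitrarily reliable, while no reliable transmission is possible above it.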
== Overview ==