{{short description|Limit on data transfer rate}}
 
{{Information theory}}
 
{{redirect|Shannon's theorem|text=Shannon's name is also associated with the [[sampling theorem]]}}
 
In [[information theory]], the '''noisy-channel coding theorem''' (sometimes '''Shannon's theorem''' or '''Shannon's limit''') establishes that for any given degree of noise contamination of a communication channel, it is possible (in theory) to communicate discrete data (digital [[information]]) nearly error-free up to a computable maximum rate through the channel. This result was presented by [[Claude Shannon]] in 1948 and was based in part on earlier work and ideas of [[Harry Nyquist]] and [[Ralph Hartley]].
 
The '''Shannon limit''' or '''Shannon capacity''' of a communication channel is the maximum [[Code rate|rate]] at which error-free data can theoretically be transferred over the channel, for a particular noise level, when the link is subject to random data transmission errors. It was first described by Shannon (1948), and shortly afterwards published in a book by Shannon and [[Warren Weaver]] entitled ''[[The Mathematical Theory of Communication]]'' (1949). This founded the modern discipline of [[information theory]].
 
== Mathematical statement ==
 
[[Image:Noisy-channel coding theorem — channel capacity graph.png|thumb|right|300px|Graph showing the proportion of a channel’s capacity (''y''-axis) that can be used for payload based on how noisy the channel is (probability of bit flips; ''x''-axis)]]
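
For the [[binary symmetric channel]] shown in the graph, the capacity in bits per channel use is <math>C = 1 - H_\text{b}(p)</math>, where <math>H_\text{b}</math> is the [[binary entropy function]] and <math>p</math> is the bit-flip probability. The following is a minimal Python sketch of this relationship; the function names are illustrative only and do not come from any particular library.

<syntaxhighlight lang="python">
import math

def binary_entropy(p: float) -> float:
    """Binary entropy H_b(p) in bits; taken to be 0 at p = 0 or p = 1."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity of a binary symmetric channel with crossover probability p,
    in bits per channel use: C = 1 - H_b(p)."""
    return 1.0 - binary_entropy(p)

# Fraction of the channel usable for payload at a few noise levels
for p in (0.0, 0.01, 0.1, 0.5):
    print(f"p = {p:4.2f}  ->  C = {bsc_capacity(p):.4f} bits/use")
</syntaxhighlight>

For instance, at a crossover probability of 0.1 the capacity is about 0.53 bits per channel use, so roughly half of the channel's raw bit rate remains available for payload, while at <math>p = 0.5</math> the capacity drops to zero.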
 
The basic mathematical model for a communication system is the following: