{{redirect|Shannon's theorem|text=Shannon's name is also associated with the [[sampling theorem]]}}
In [[information theory]], the '''noisy-channel coding theorem''' (sometimes '''Shannon's theorem''' or '''Shannon's limit''') establishes that for any given degree of noise contamination of a communication channel, it is possible to communicate discrete data (digital information) nearly error-free up to a computable maximum rate through the channel.
The '''Shannon limit''' or '''Shannon capacity''' of a communication channel refers to the maximum [[Code rate|rate]] at which error-free data can theoretically be transferred over the channel for a particular noise level, when the link is subject to random data transmission errors. It was first described by Shannon (1948), and shortly afterwards published in a book by Shannon and [[Warren Weaver]] entitled ''[[The Mathematical Theory of Communication]]'' (1949). This work founded the modern discipline of [[information theory]].
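As an illustration of the limit the theorem describes (a standard formulation, not spelled out in this lead), the capacity of a discrete memoryless channel can be written as the supremum of the [[mutual information]] between the channel input ''X'' and output ''Y'', taken over all input distributions:

<math>C = \sup_{p_X} I(X;Y)</math>

For a band-limited channel with additive white Gaussian noise, this takes the familiar Shannon–Hartley form <math>C = B \log_2\left(1 + \frac{S}{N}\right)</math>, where ''B'' is the bandwidth in hertz and ''S''/''N'' is the signal-to-noise ratio.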