{{short description|Limit on data transfer rate}}
{{redirect|Shannon's theorem|text=Shannon's name is also associated with the [[sampling theorem]]}}
In [[information theory]], the '''noisy-channel coding theorem''' (sometimes '''Shannon's theorem''' or '''Shannon's limit''') establishes that for any given degree of noise contamination of a communication channel, it is possible (in theory) to communicate discrete data (digital [[information]]) nearly error-free up to a computable maximum rate through the channel. This result was presented by [[Claude Shannon]] in 1948 and was based in part on earlier work and ideas of [[Harry Nyquist]] and [[Ralph Hartley]].
The '''Shannon limit''' or '''Shannon capacity''' of a communication channel refers to the maximum [[Code rate|rate]] at which error-free data can theoretically be transferred over the channel, for a particular noise level, when the link is subject to random data transmission errors. It was first described by Shannon (1948), and shortly afterwards published in a book by Shannon and [[Warren Weaver]] entitled ''[[The Mathematical Theory of Communication]]'' (1949). This founded the modern discipline of [[information theory]].
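As a concrete illustration of such a computable maximum rate, the [[binary symmetric channel]] with crossover probability ''p'' has the closed-form capacity ''C'' = 1 − ''H''<sub>b</sub>(''p''), where ''H''<sub>b</sub> is the binary entropy function. The following minimal sketch (an illustrative aside, not part of Shannon's presentation; the function names are ours) evaluates this limit for a few noise levels:

<syntaxhighlight lang="python">
import math

def binary_entropy(p: float) -> float:
    """Binary entropy H_b(p) in bits; H_b(0) = H_b(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity of a binary symmetric channel with crossover probability p,
    in bits per channel use: C = 1 - H_b(p)."""
    return 1.0 - binary_entropy(p)

for p in (0.0, 0.01, 0.11, 0.5):
    print(f"p = {p:4.2f}  ->  C = {bsc_capacity(p):.4f} bits per channel use")
</syntaxhighlight>

At ''p'' = 0.5 the capacity drops to zero (the output carries no information about the input), matching the shape of the capacity graph shown below.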
== Mathematical statement ==
[[Image:Noisy-channel coding theorem — channel capacity graph.png|thumb|right|300px|Graph showing the proportion of a channel’s capacity (''y''-axis) that can be used for payload based on how noisy the channel is (probability of bit flips; ''x''-axis)]]
The basic mathematical model for a communication system is the following:
# The message W is sent across the channel.
# The receiver receives a sequence according to <math>P(y^n|x^n(w))= \prod_{i = 1}^np(y_i|x_i(w))</math>
# Sending these codewords across the channel, we receive <math>Y_1^n</math>, and decode to some source sequence if there exists exactly one codeword that is jointly typical with <math>Y_1^n</math>. If there are no jointly typical codewords, or if there is more than one, an error is declared. An error also occurs if the decoded codeword does not match the original codeword. This is called ''typical set decoding''.
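As an informal sketch of this decoding rule (our own illustration, not from Shannon's paper: for a binary symmetric channel with uniform inputs, joint typicality reduces to the empirical fraction of flipped positions lying close to the crossover probability ''p''):

<syntaxhighlight lang="python">
import random

def bsc(x, p):
    """Binary symmetric channel: flip each bit of x independently with probability p."""
    return [b ^ (random.random() < p) for b in x]

def typical_decode(y, codebook, p, eps):
    """Typical-set decoding over the BSC: a codeword counts as 'jointly typical'
    with y when its empirical flip fraction is within eps of p.  Decode only if
    exactly one codeword qualifies; otherwise declare an error (return None)."""
    n = len(y)
    matches = [w for w, c in enumerate(codebook)
               if abs(sum(ci != yi for ci, yi in zip(c, y)) / n - p) < eps]
    return matches[0] if len(matches) == 1 else None

random.seed(1)
n, p, eps, M = 1000, 0.1, 0.03, 16           # block length, crossover prob., slack, messages
codebook = [[random.randint(0, 1) for _ in range(n)] for _ in range(M)]  # random code
w = 5                                         # message to send
y = bsc(codebook[w], p)                       # received sequence
print("decoded:", typical_decode(y, codebook, p, eps), " sent:", w)
</syntaxhighlight>

Because the rate here is far below capacity (only <math>\tfrac{\log_2 16}{1000}</math> bits per channel use, against <math>C = 1 - H_b(0.1) \approx 0.531</math>), an incorrect codeword is jointly typical with <math>Y_1^n</math> only with vanishing probability, which is the heart of the achievability argument.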
The probability of error of this scheme is divided into two parts:
{{reflist}}
== References ==
*{{cite web |first=B. |last=Aazhang |title=Shannon's Noisy Channel Coding Theorem |date=2004 |work=Connections |url=https://www.cse.iitd.ac.in/~vinay/courses/CSL858/reading/m10180.pdf}}
*{{cite book |author1-link=Thomas M. Cover |last1=Cover |first1=T.M. |last2=Thomas |first2=J.A. |title=Elements of Information Theory |publisher=Wiley |date=1991 |isbn=0-471-06259-6 }}
*{{cite book |author1-link=Robert Fano |first=R.M. |last=Fano |title=Transmission of information; a statistical theory of communications |publisher=MIT Press |date=1961 |isbn=0-262-06001-9 }}
*{{cite journal |last1=Feinstein |first1=Amiel |title=A new basic theorem of information theory |journal=Transactions of the IRE Professional Group on Information Theory |date=September 1954 |volume=4 |issue=4 |pages=2–22 |doi=10.1109/TIT.1954.1057459 |hdl=1721.1/4798 |bibcode=1955PhDT........12F|hdl-access=free }}
*{{cite journal |first=Lars |last=Lundheim |title=On Shannon and Shannon's Formula |journal=Telektronikk |volume=98 |issue=1 |pages=20–29 |date=2002 |url=http://www.cs.miami.edu/home/burt/learning/Csc524.142/LarsTelektronikk02.pdf}}
*{{cite book |author1-link=David J.C. MacKay |first=David J.C. |last=MacKay |title=Information Theory, Inference, and Learning Algorithms |publisher=Cambridge University Press |date=2003 |isbn=0-521-64298-1 |pages= |url=http://www.inference.phy.cam.ac.uk/mackay/itila/book.html}} [free online]
*{{cite journal|author-link=Claude E. Shannon | doi=10.1002/j.1538-7305.1948.tb01338.x | title=A Mathematical Theory of Communication | year=1948 | last1=Shannon | first1=C. E. | journal=Bell System Technical Journal | volume=27 | issue=3 | pages=379–423 }}
*{{cite book |author-link=Claude E. Shannon |first=C.E. |last=Shannon |title=A Mathematical Theory of Communication |publisher=University of Illinois Press |orig-year=1948 |date=1998 |pages= |url=http://cm.bell-labs.com/cm/ms/what/shannonday/paper.html}}
*{{cite journal |first=J. |last=Wolfowitz |title=The coding of messages subject to chance errors |journal=Illinois J. Math. |volume=1 |issue= 4|pages=591–606 |date=1957 |doi= 10.1215/ijm/1255380682|url=https://projecteuclid.org/download/pdf_1/euclid.ijm/1255380682|doi-access=free }}
{{DEFAULTSORT:Noisy-Channel Coding Theorem}}