== Overview ==
 
Stated by [[Claude Shannon]] in 1948, the theorem describes the maximum possible efficiency of [[error-correcting code|error-correcting methods]] versus levels of noise interference and data corruption. Shannon's theorem has wide-ranging applications in both communications and [[data storage device|data storage]]. This theorem is of foundational importance to the modern field of [[information theory]]. Shannon only gave an outline of the proof. The first rigorous proof for the discrete case is due to [[Amiel Feinstein]]<ref>{{cite journal |last1=Feinstein |first1=Amiel |title=A new basic theorem of information theory |journal=Transactions of the IRE Professional Group on Information Theory |date=September 1954 |volume=4 |issue=4 |pages=2–22 |doi=10.1109/TIT.1954.1057459 |hdl=1721.1/4798 |bibcode=1955PhDT........12F |hdl-access=free}}</ref> in 1954.
 
The Shannon theorem states that, given a noisy channel with [[channel capacity]] ''C'' and information transmitted at a rate ''R'', if <math>R < C</math> there exist [[code]]s that allow the [[probability of error]] at the receiver to be made arbitrarily small. This means that, theoretically, it is possible to transmit information nearly without error at any rate below a limiting rate, ''C''.
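
A minimal numerical sketch of what the theorem asserts, assuming a [[binary symmetric channel]] with crossover probability ''p'', whose capacity is <math>C = 1 - H_2(p)</math>; the function names below are illustrative only, not from any cited source:

<syntaxhighlight lang="python">
import math

def binary_entropy(p: float) -> float:
    """Binary entropy H2(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

p = 0.11                 # channel crossover probability (illustrative)
C = bsc_capacity(p)      # about 0.50 bits per channel use
R = 0.4                  # proposed transmission rate

# Shannon's theorem: reliable codes exist exactly when R < C.
verdict = "achievable" if R < C else "not achievable"
print(f"C = {C:.4f} bits/use; R = {R} is {verdict}")
</syntaxhighlight>
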
The channel capacity <math>C</math> can be calculated from the physical properties of a channel; for a band-limited channel with Gaussian noise, it is given by the [[Shannon–Hartley theorem]].
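
As a hedged illustration of that calculation (the numbers below are arbitrary, telephone-line-like values, not drawn from this article), the Shannon–Hartley formula <math>C = B \log_2(1 + S/N)</math> can be evaluated directly:

<syntaxhighlight lang="python">
import math

def shannon_hartley(bandwidth_hz: float, snr_linear: float) -> float:
    """Capacity in bit/s of a band-limited AWGN channel (Shannon–Hartley)."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

B = 3000.0                    # bandwidth in hertz (illustrative)
snr_db = 30.0                 # signal-to-noise ratio in decibels
snr = 10 ** (snr_db / 10.0)   # convert dB to a linear power ratio

print(f"C = {shannon_hartley(B, snr):.0f} bit/s")  # roughly 29,900 bit/s
</syntaxhighlight>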
 
Simple schemes such as "send the message 3 times and use a best 2 out of 3 voting scheme if the copies differ" are inefficient error-correction methods, unable to asymptotically guarantee that a block of data can be communicated free of error. Advanced techniques such as [[Reed–Solomon code]]s and, more recently, [[low-density parity-check code|low-density parity-check]] (LDPC) codes and [[turbo code]]s come much closer to reaching the theoretical Shannon limit, but at a cost of high computational complexity. Using these highly efficient codes and with the computing power in today's [[digital signal processors]], it is now possible to reach very close to the Shannon limit. In fact, it was shown that LDPC codes can reach within 0.0045&nbsp;dB of the Shannon limit (for binary [[additive white Gaussian noise]] (AWGN) channels, with very long block lengths).<ref>{{cite journal |author1-link=Sae-Young Chung |author1=Sae-Young Chung |author2-link=G. David Forney, Jr. |first2=G. D. |last2=Forney |first3=T. J. |last3=Richardson |author4-link=Rüdiger Urbanke |first4=R. |last4=Urbanke |title=On the Design of Low-Density Parity-Check Codes within 0.0045 dB of the Shannon Limit |journal=IEEE Communications Letters |volume=5 |issue=2 |pages=58–60 |date=February 2001 |doi=10.1109/4234.905935 |s2cid=7381972 |url=http://www.josephboutros.org/ldpc_vs_turbo/ldpc_Chung_CLfeb01.pdf}}</ref>
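
To see why such repetition schemes fail asymptotically (a sketch under an assumption not made in the cited sources: a binary symmetric channel with bit-flip probability ''p''), note that triple repetition with majority voting reduces the per-bit error to <math>3p^2(1-p) + p^3</math> at a fixed rate of 1/3, yet the probability that an entire ''k''-bit block arrives error-free still tends to zero as ''k'' grows:

<syntaxhighlight lang="python">
def majority3_bit_error(p: float) -> float:
    """Per-bit error after sending each bit 3 times over a BSC(p), majority vote."""
    return 3 * p**2 * (1 - p) + p**3

p = 0.1
pe = majority3_bit_error(p)        # 0.028: better than 0.1, but bounded away from 0
for k in (10, 100, 1000):
    block_ok = (1 - pe) ** k       # probability the whole k-bit block decodes correctly
    print(f"k={k:5d}: P(block correct) = {block_ok:.2e}")
# The rate stays fixed at 1/3 while block reliability collapses, whereas the
# theorem guarantees codes at any fixed rate below C whose error probability vanishes.
</syntaxhighlight>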
 
== Mathematical statement ==
=== Strong converse for discrete memoryless channels ===
 
A strong converse theorem, proven by Wolfowitz in 1957,<ref>{{cite book |first=Robert |last=Gallager |title=Information Theory and Reliable Communication |location=New York |publisher=Wiley & Sons |date=1968 |isbn=0-471-29048-3}}</ref> states that,
 
: <math>
{{reflist}}
== References ==
*{{cite book |author1-link=Thomas M. Cover |last1=Cover |first1=T. M. |last2=Thomas |first2=J. A. |title=Elements of Information Theory |publisher=Wiley & Sons |date=1991 |isbn=0-471-06259-6}}
*{{cite book |author1-link=Robert Fano |first=R. M. |last=Fano |title=Transmission of information; a statistical theory of communications |publisher=MIT Press |date=1961 |isbn=0-262-06001-9}}
*{{cite journal |author1-link=Amiel Feinstein |last1=Feinstein |first1=Amiel |title=A new basic theorem of information theory |journal=Transactions of the IRE Professional Group on Information Theory |date=September 1954 |volume=4 |issue=4 |pages=2–22 |doi=10.1109/TIT.1954.1057459 |hdl=1721.1/4798 |bibcode=1955PhDT........12F |hdl-access=free}}
*{{cite book |author1-link=David J.C. MacKay |first=David J. C. |last=MacKay |title=Information Theory, Inference, and Learning Algorithms |publisher=Cambridge University Press |date=2003 |isbn=0-521-64298-1 |url=http://www.inference.phy.cam.ac.uk/mackay/itila/book.html}} [free online]
*{{cite journal |author-link=Claude E. Shannon |last1=Shannon |first1=C. E. |title=A Mathematical Theory of Communication |journal=Bell System Technical Journal |year=1948 |volume=27 |issue=3 |pages=379–423 |doi=10.1002/j.1538-7305.1948.tb01338.x |url=https://ieeexplore.ieee.org/document/6773024}}
*{{cite book |author-link=Claude E. Shannon |first=C. E. |last=Shannon |title=A Mathematical Theory of Communication |location=Urbana, IL |publisher=University of Illinois Press |orig-year=1948 |date=1998 |url=http://cm.bell-labs.com/cm/ms/what/shannonday/paper.html}}
*{{cite journal |author-link=Jacob Wolfowitz |first=J. |last=Wolfowitz |title=The coding of messages subject to chance errors |journal=Illinois J. Math. |volume=1 |issue=4 |pages=591–606 |date=1957 |doi=10.1215/ijm/1255380682 |url=https://projecteuclid.org/download/pdf_1/euclid.ijm/1255380682}}
*{{cite journal |first=Lars |last=Lundheim |title=On Shannon and Shannon's Formula |journal=Telektronik |volume=98 |issue=1 |pages=20–29 |date=2002 |url=http://www.cs.miami.edu/home/burt/learning/Csc524.142/LarsTelektronikk02.pdf}}
*{{cite web |first=B. |last=Aazhang |title=Shannon's Noisy Channel Coding Theorem |date=2004 |work=Connections |url=https://www.cse.iitd.ac.in/~vinay/courses/CSL858/reading/m10180.pdf}}
{{DEFAULTSORT:Noisy-Channel Coding Theorem}}