In [[information theory]], a '''low-density parity-check''' ('''LDPC''') '''code''' is a [[Linear code|linear]] [[error correcting code]], a method of transmitting a message over a [[signal noise|noisy]] transmission channel.<ref>{{cite book |author-link=David J.C. MacKay |first=David J. |last=MacKay |title=Information Theory, Inference, and Learning Algorithms |publisher=Cambridge University Press |date=2003 |isbn=0-521-64298-1 |pages= |url=http://www.inference.phy.cam.ac.uk/mackay/itila/book.html}}</ref><ref>{{cite book |author-link=Todd K. Moon |first=Todd K. |last=Moon |title=Error Correction Coding, Mathematical Methods and Algorithms |publisher=Wiley |date=2005 |isbn=0-471-64800-0 |pages= |url=}} (Includes code)</ref> An LDPC code is constructed using a sparse [[Tanner graph]] (a subclass of the [[bipartite graph]]).<ref>[[Amin Shokrollahi]] (2003) LDPC Codes: An Introduction</ref> LDPC codes are [[:Category:capacity-approaching codes|capacity-approaching codes]], which means that practical constructions exist that allow the noise threshold to be set very close to the theoretical maximum (the [[Shannon–Hartley theorem|Shannon limit]]) for a symmetric memoryless channel. The noise threshold defines an upper bound for the channel noise, up to which the probability of lost information can be made as small as desired. Using iterative [[belief propagation]] techniques, LDPC codes can be decoded in time linear in their block length.
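For illustration, membership in an LDPC code can be checked directly from its parity-check matrix: a word is a codeword exactly when it satisfies every parity check, i.e. when its syndrome is zero over GF(2). The following is a minimal sketch in Python with NumPy; the small parity-check matrix is an arbitrary example chosen only for illustration, as practical LDPC codes use far larger and sparser matrices.

<syntaxhighlight lang="python">
import numpy as np

# A small, arbitrary sparse parity-check matrix H (rows are parity checks,
# columns are code bits).  Real LDPC codes use much larger, sparser matrices.
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 0, 0, 1, 1],
              [0, 0, 1, 1, 0, 1]])

def is_codeword(H, c):
    """A word c is a codeword iff every parity check is satisfied,
    i.e. the syndrome H·c is zero over GF(2)."""
    return not np.any(H @ c % 2)

print(is_codeword(H, np.array([0, 0, 0, 0, 0, 0])))  # True: the all-zero word
print(is_codeword(H, np.array([1, 0, 0, 0, 0, 0])))  # False: violates the first and third checks
</syntaxhighlight>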
LDPC codes are also known as '''Gallager codes''', in honor of [[Robert G. Gallager]], who developed the LDPC concept in his doctoral dissertation at the [[Massachusetts Institute of Technology]] in 1960.<ref>{{Cite news |url=http://web.mit.edu/newsoffice/2010/gallager-codes-0121.html |title=Explained: Gallager codes |first=L. |last=Hardesty |journal=MIT News |date= January 21, 2010 |access-date= August 7, 2013 }}</ref><ref name="G1962">{{cite journal |first=R.G. |last=Gallager |title=Low density parity check codes |journal=IRE Trans. Inf. Theory |volume=8 |issue=1 |pages=21–28 |date=January 1962 |doi=10.1109/TIT.1962.1057683 |hdl=1721.1/11804/32786367-MIT|s2cid=260490814 }}</ref> However, LDPC codes require computationally expensive iterative decoding, so they went unused for decades. In 1993 the newly invented [[turbo code]]s demonstrated that codes with iterative decoding could far outperform other codes used at that time, but turbo codes were patented and required a fee for use. This renewed interest in LDPC codes, which were shown to have similar performance, but were much older and patent-free.<ref name="Closing">{{cite journal |author=Erico Guizzo |title=CLOSING IN ON THE PERFECT CODE |journal=IEEE Spectrum |date=Mar 1, 2004 |url=https://spectrum.ieee.org/closing-in-on-the-perfect-code}} "Another advantage, perhaps the biggest of all, is that the LDPC patents have expired, so companies can use them without having to pay for intellectual-property rights."</ref> The fundamental patent for turbo codes expired on August 29, 2013,<ref>{{cite patent |url=https://www.google.com/patents/US5446747 |country=US |number=5446747}}</ref><ref>{{cite journal |journal=New Scientist |title=Communication speed nears terminal velocity |first=D. |last=Mackenzie |date=9 July 2005}}</ref> but LDPC codes are still used for their technical merits.
LDPC codes have been shown to have ideal combinatorial properties. In his dissertation, Gallager showed that LDPC codes achieve the [[Gilbert–Varshamov bound for linear codes]] over binary fields with high probability. In 2020 it was shown that Gallager's LDPC codes achieve [[list decoding]] capacity and also achieve the [[Gilbert–Varshamov bound for linear codes]] over general fields.<ref name="MRRSW20">{{cite journal |
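The [[Gilbert–Varshamov bound for linear codes|Gilbert–Varshamov bound]] for binary codes is itself easy to evaluate numerically: codes of relative minimum distance ''δ'' < 1/2 exist with rate at least 1 − ''H''(''δ''), where ''H'' is the binary entropy function. The following minimal sketch in Python (with illustrative function names) evaluates this rate guarantee.

<syntaxhighlight lang="python">
from math import log2

def binary_entropy(p):
    """Binary entropy function H(p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def gilbert_varshamov_rate(delta):
    """Gilbert–Varshamov bound for binary codes: for relative minimum distance
    0 < delta < 1/2 there exist linear codes of rate at least 1 - H(delta).
    Gallager showed that his random LDPC ensembles attain this bound with
    high probability."""
    return 1.0 - binary_entropy(delta)

print(round(gilbert_varshamov_rate(0.11), 3))  # ~0.5: rate-1/2 codes with relative distance ~0.11 exist
</syntaxhighlight>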
==History==
=== Updating node information ===
In recent years{{When|date=May 2023}}, there has also been a great deal of work studying the effects of alternative schedules for variable-node and constraint-node updates. The original technique used for decoding LDPC codes was known as ''flooding'': before any variable node is updated, all constraint nodes must be updated, and vice versa. In later work by Vila Casado ''et al.'',<ref name=Vila07>{{cite conference |
The intuition behind these algorithms is that variable nodes whose values vary the most are the ones that need to be updated first. Highly reliable nodes, whose [[log-likelihood ratio]] (LLR) magnitude is large and does not change significantly from one update to the next, do not require updates with the same frequency as other nodes, whose sign and magnitude fluctuate more widely.{{Citation needed|date=May 2023}}
These scheduling algorithms converge faster and have lower error floors than those that use flooding. The lower error floors result from the ability of the Informed Dynamic Scheduling (IDS)<ref name=Vila07/> algorithm to overcome trapping sets of near codewords.<ref name=Richardson03>{{cite journal |first=T. |last=Richardson |title=Error floors of LDPC codes |journal=Proceedings of the
When nonflooding scheduling algorithms are used, an alternative definition of an iteration applies: for an (''n'', ''k'') LDPC code of rate ''k''/''n'', a full ''iteration'' occurs once ''n'' variable nodes and ''n'' − ''k'' constraint nodes have been updated, regardless of the order in which they were updated.{{Citation needed|date=May 2023}}
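The difference between schedules can be made concrete with a short sketch. The following Python/NumPy function is an illustrative baseline decoder using the min-sum approximation and the original flooding schedule; it is not the IDS algorithm itself, and the comments only indicate where a dynamic schedule would choose the order of updates.

<syntaxhighlight lang="python">
import numpy as np

def minsum_flooding(H, llr, max_iters=50):
    """Illustrative min-sum decoder with a flooding schedule: every check
    node is updated, then every variable node, in each full iteration.
    H: (m, n) 0/1 parity-check matrix; llr: length-n channel LLRs."""
    m, n = H.shape
    check_nbrs = [np.flatnonzero(H[i]) for i in range(m)]
    var_nbrs = [np.flatnonzero(H[:, j]) for j in range(n)]
    v2c = H * llr                  # variable-to-check messages, initialised to channel LLRs
    c2v = np.zeros((m, n))         # check-to-variable messages
    hard = (llr < 0).astype(int)
    for _ in range(max_iters):
        # Check-node update (flooding: all checks before any variable).
        # An informed dynamic schedule would instead pick the next message
        # to propagate by the size of its residual (how much it would change).
        for i in range(m):
            for j in check_nbrs[i]:
                others = check_nbrs[i][check_nbrs[i] != j]
                msgs = v2c[i, others]
                c2v[i, j] = np.prod(np.sign(msgs)) * np.min(np.abs(msgs))
        # Variable-node update and tentative hard decision.
        total = llr + c2v.sum(axis=0)
        hard = (total < 0).astype(int)
        if not np.any(H @ hard % 2):   # all parity checks satisfied: stop early
            break
        for j in range(n):
            for i in var_nbrs[j]:
                v2c[i, j] = total[j] - c2v[i, j]
    return hard
</syntaxhighlight>

Under a residual-based schedule, the messages that would change the most are propagated first, which is the source of the faster convergence described above.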
== Code construction ==
For large block sizes, LDPC codes are commonly constructed by first studying the behaviour of decoders. As the block size tends to infinity, LDPC decoders can be shown to have a noise threshold below which decoding is reliably achieved, and above which decoding is not achieved,<ref name=richardson01>{{cite journal |
The construction of a specific LDPC code after this optimization falls into two main types of techniques:{{Citation needed|date=May 2023}}
*Pseudorandom approaches
*Combinatorial approaches
Construction by a pseudo-random approach builds on theoretical results that, for large block size, a random construction gives good decoding performance.<ref name=MacKay96/> In general, pseudorandom codes have complex encoders, but pseudorandom codes with the best decoders can have simple encoders.<ref name=richardson01b>{{cite journal |
Combinatorial approaches can be used to optimize the properties of small block-size LDPC codes or to create codes with simple encoders.{{Citation needed|date=May 2023}}
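The noise threshold mentioned at the start of this section can be computed explicitly in the simplest setting. For a (''d''<sub>v</sub>, ''d''<sub>c</sub>)-regular ensemble on the binary erasure channel with erasure probability ''ε'', density evolution tracks the message erasure probability through the update <math>x_{l+1} = \varepsilon\left(1-(1-x_l)^{d_c-1}\right)^{d_v-1}</math>, and the threshold is the largest ''ε'' for which this recursion tends to zero. The following is a minimal sketch in Python; the iteration counts and tolerances are arbitrary choices for illustration.

<syntaxhighlight lang="python">
def bec_density_evolution(eps, dv, dc, iters=1000):
    """Density evolution for a (dv, dc)-regular LDPC ensemble on the binary
    erasure channel with erasure probability eps.  Returns the residual
    erasure probability after `iters` iterations; it tends to zero exactly
    when eps is below the ensemble's decoding threshold."""
    x = eps
    for _ in range(iters):
        x = eps * (1.0 - (1.0 - x) ** (dc - 1)) ** (dv - 1)
    return x

def bec_threshold(dv, dc, tol=1e-6):
    """Locate the decoding threshold by bisection on the erasure probability."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if bec_density_evolution(mid, dv, dc) < 1e-9:
            lo = mid   # decoding succeeds, so the threshold is higher
        else:
            hi = mid   # decoding fails, so the threshold is lower
    return (lo + hi) / 2

# The classic (3, 6)-regular ensemble (design rate 1/2) has a threshold near 0.429.
print(round(bec_threshold(3, 6), 3))
</syntaxhighlight>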
[http://www.eecg.toronto.edu/~tcc/darabiha_jssc08.pdf "Power Reduction Techniques for LDPC Decoders"]
</ref> Compared to randomly generated LDPC codes, structured LDPC codes, such as the LDPC code used in the [[DVB-S2]] standard, can have simpler and therefore lower-cost hardware; this is particularly true of codes constructed such that the H matrix is a [[circulant matrix]].<ref>
{{cite journal |
</ref>
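For illustration, the following minimal sketch in Python with NumPy assembles a quasi-cyclic parity-check matrix from circulant blocks, each a cyclically shifted identity matrix. The block size and shift values are arbitrary examples and are not taken from DVB-S2 or any other standard.

<syntaxhighlight lang="python">
import numpy as np

def circulant_block(z, s):
    """z-by-z circulant permutation block: the identity matrix with its
    columns cyclically shifted by s (s = -1 denotes an all-zero block)."""
    if s < 0:
        return np.zeros((z, z), dtype=int)
    return np.roll(np.eye(z, dtype=int), s, axis=1)

def quasi_cyclic_H(shifts, z):
    """Expand a small matrix of shift values into a full parity-check matrix,
    one z-by-z circulant block per entry.  Storing H therefore only requires
    the shift values, which is what low-cost hardware exploits."""
    return np.block([[circulant_block(z, s) for s in row] for row in shifts])

# Arbitrary illustrative base matrix of shifts (not from any standard).
shifts = [[0, 1, -1, 2],
          [2, -1, 0, 1]]
H = quasi_cyclic_H(shifts, z=4)
print(H.shape)  # (8, 16): an 8-by-16 sparse parity-check matrix
</syntaxhighlight>

Because each block is determined by a single shift value, encoder and decoder hardware can address memory in a regular pattern instead of storing the full matrix.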
Yet another way of constructing LDPC codes is to use [[finite geometry|finite geometries]]. This method was proposed by Y. Kou ''et al.'' in 2001.<ref name=Kou1>{{cite journal |
== LDPC codes vs. turbo codes ==
LDPC codes can be compared with other powerful coding schemes, e.g. [[turbo code]]s.<ref>{{cite conference |
== See also ==
* [http://bernh.net/media/download/papers/ldpc.pdf LDPC Codes – a brief Tutorial (by Bernhard Leiner, 2005)]
* [https://www.nt.tuwien.ac.at/wp-content/uploads/2016/10/DC2-16_Ch7_LDPC.pdf LDPC Codes (TU Wien)] {{Webarchive|url=https://web.archive.org/web/20190228003946/https://www.nt.tuwien.ac.at/wp-content/uploads/2016/10/DC2-16_Ch7_LDPC.pdf |date=February 28, 2019 }}
* {{cite book |url=http://www.inference.phy.cam.ac.uk/mackay/itila/ |title=Information Theory, Inference, and Learning Algorithms |author-link=David J.C. MacKay |first=David J.C. |last=MacKay |publisher=Cambridge University Press |isbn=9780521642989 |date= September 25, 2003|pages=557–573 |chapter=47. Low-Density Parity-Check Codes |chapter-url={{GBurl|AKuMj4PN_EMC|p=557}} }}
* {{cite
* [http://www.ics.uci.edu/~welling/teaching/ICS279/LPCD.pdf LDPC Codes: An Introduction (by Amin Shokrollahi, 2003)]
* [http://www.tsc.uc3m.es/~fernando/BP_LDPC.pdf Belief-Propagation Decoding of LDPC Codes (by Amir Bennatan, Princeton University)]