'''Low-density parity-check (LDPC)''' codes are a class of [[error correction code]]s which (together with the closely related [[turbo code]]s) have gained prominence in [[coding theory]] and [[information theory]] since the late 1990s. Today the codes are widely used in applications ranging from wireless communications to flash-memory storage. Together with turbo codes, they sparked a revolution in coding theory, achieving order-of-magnitude improvements in performance compared to traditional error correction codes.<ref>{{Cite web |title=Turbo Codes Explained: History, Examples, and Applications - IEEE Spectrum |url=https://spectrum.ieee.org/turbo-codes |access-date=2024-12-18 |website=spectrum.ieee.org |language=en}}</ref>
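As a toy illustration of the "parity-check" structure named in the definition above, the sketch below checks codeword validity against a small parity-check matrix. The matrix is a (7,4) Hamming-style example chosen only for readability; it is not an actual LDPC matrix, which would be much larger and sparse (a low density of 1s per row and column).

```python
import numpy as np

# Small illustrative parity-check matrix H (Hamming-style, NOT a real
# LDPC matrix; practical LDPC matrices are large and sparse).
H = np.array([
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
])

def is_codeword(c):
    """A vector c is a codeword iff every parity check is satisfied,
    i.e. H @ c = 0 over GF(2)."""
    return not np.any(H @ c % 2)

print(is_codeword(np.array([0, 0, 0, 0, 0, 0, 0])))  # True: all-zero word
print(is_codeword(np.array([1, 0, 0, 0, 1, 1, 0])))  # True: a valid codeword
print(is_codeword(np.array([1, 0, 0, 0, 0, 0, 0])))  # False: one flipped bit fails checks
```

The failing checks (the nonzero entries of the syndrome <code>H @ c % 2</code>) are what an iterative decoder uses to locate and correct errors.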
Central to the performance of LDPC codes is their adaptability to the iterative [[belief propagation]] decoding algorithm. Under this algorithm, they can be designed to approach the theoretical limits ([[Channel capacity|capacities]]) of many channels.
Theoretical analysis of LDPC codes focuses on sequences of codes of fixed [[code rate]] and increasing [[block length]]. These sequences are typically tailored to a set of channels. For appropriately designed sequences, the decoding error under belief propagation can often be proven to be vanishingly small (approaching zero with the block length) at rates very close to the capacities of the channels. Furthermore, this can be achieved at a complexity that is linear in the block length.
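The iterative decoding described above can be illustrated with hard-decision bit flipping, a simplified relative of belief propagation (which passes probabilities rather than bits between checks and variables). This is only a sketch on a toy Hamming-style matrix, not the article's capacity-approaching construction:

```python
import numpy as np

# Toy parity-check matrix (same caveat as before: illustrative, not a
# real sparse LDPC matrix).
H = np.array([
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
])

def bit_flip_decode(y, H, max_iters=10):
    """Hard-decision bit-flipping: repeatedly flip the bit involved in
    the largest number of failed parity checks until all checks pass."""
    c = y.copy()
    for _ in range(max_iters):
        syndrome = H @ c % 2
        if not syndrome.any():
            return c  # every parity check satisfied
        # For each bit, count how many failing checks involve it.
        fails = syndrome @ H
        c[np.argmax(fails)] ^= 1  # flip the most suspicious bit
    return c

received = np.array([1, 1, 0, 0, 1, 1, 0])  # codeword with one bit flipped
print(bit_flip_decode(received, H))          # recovers [1 0 0 0 1 1 0]
```

Full belief propagation replaces these hard bit counts with probabilistic messages exchanged along the edges of the code's [[Tanner graph]], which is what allows well-designed sequences to approach channel capacity.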
Renewed interest in the codes emerged following the invention of the closely related [[turbo code]]s (1993), whose similarly iterative decoding algorithm outperformed the other codes in use at that time. LDPC codes were subsequently rediscovered in 1996.<ref name="MacKay96">[[David J.C. MacKay]] and Radford M. Neal, "Near Shannon Limit Performance of Low Density Parity Check Codes," Electronics Letters, July 1996</ref> Initial industry preference for LDPC codes over turbo codes stemmed from patent-related constraints on the latter.<ref name="Closing">{{cite journal |author=Erico Guizzo |date=Mar 1, 2004 |title=CLOSING IN ON THE PERFECT CODE |url=https://spectrum.ieee.org/closing-in-on-the-perfect-code |url-status=dead |journal=IEEE Spectrum |archive-url=https://web.archive.org/web/20210902170851/https://spectrum.ieee.org/closing-in-on-the-perfect-code |archive-date=September 2, 2021}} "Another advantage, perhaps the biggest of all, is that the LDPC patents have expired, so companies can use them without having to pay for intellectual-property rights."</ref> Since then, advances in LDPC codes have allowed them to surpass turbo codes in terms of [[error floor]] and performance at higher [[code rate]]s, leaving turbo codes better suited only to lower code rates.<ref>[http://deepspace.jpl.nasa.gov/dsndocs/810-005/208/208B.pdf Telemetry Data Decoding, Design Handbook]</ref> Although the fundamental patent for turbo codes expired on August 29, 2013,<ref>{{cite patent|country=US|number=5446747|url=https://www.google.com/patents/US5446747}}</ref><ref>{{cite journal |last=Mackenzie |first=D. |date=9 July 2005 |title=Communication speed nears terminal velocity |journal=New Scientist}}</ref> LDPC codes are still preferred for their technical merits.
Theoretical interest in LDPC codes also follows from their amenability to mathematical analysis. In his dissertation, Gallager showed that LDPC codes achieve the [[Gilbert–Varshamov bound for linear codes]] over binary fields with high probability. Over the [[binary erasure channel]], code sequences have been designed at rates arbitrarily close to channel capacity, with provably vanishing decoding error probability and linear decoding complexity.
==Applications==