{{More footnotes needed|date=February 2025}}
In [[coding theory]], '''block codes''' are a large and important family of [[Channel coding|error-correcting codes]] that encode data in blocks.
There is a vast number of examples for block codes, many of which have a wide range of practical applications. Block codes are conceptually useful because they allow coding theorists, [[mathematics|mathematicians]], and [[computer science|computer scientists]] to study the limitations of ''all'' block codes in a unified way.
Such limitations often take the form of ''bounds'' that relate different parameters of the block code to each other, such as its rate and its ability to detect and correct errors.
Examples of block codes are [[Reed–Solomon code]]s, [[Hamming code]]s, [[Hadamard code]]s, [[Expander code]]s, [[Golay code (disambiguation)|Golay code]]s, [[Reed–Muller code]]s and [[Polar code (coding theory)|Polar code]]s.
Algebraic block codes are typically [[Soft-decision decoder|hard-decoded]] using algebraic decoders.{{Technical statement|date=May 2015}}
The term ''block code'' may also refer to any error-correcting code that acts on a block of <math>k</math> bits of input data to produce <math>n</math> bits of output data <math>(n,k)</math>. Consequently, the block coder is a ''memoryless'' device.
This article deals with "algebraic block codes".
Then the minimum distance <math>d</math> of the code <math>C</math> is defined as
:<math>d := \min_{m_1,m_2\in\Sigma^k\atop m_1\neq m_2} \Delta[C(m_1),C(m_2)]</math>.
Since any code has to be [[injective]], any two codewords will disagree in at least one position, so the distance of any code is at least <math>1</math>. Moreover, the '''distance''' equals the '''[[Hamming weight#Minimum weight|minimum weight]]''' for linear block codes because:{{cn|date=December 2024}}
:<math>\min_{m_1,m_2\in\Sigma^k\atop m_1\neq m_2} \Delta[C(m_1),C(m_2)] = \min_{m_1,m_2\in\Sigma^k\atop m_1\neq m_2} \Delta[\mathbf{0},C(m_1)-C(m_2)] = \min_{c\in C\atop c\neq \mathbf{0}} w(c) = w_{\min}</math>.
A larger distance allows for more error correction and detection.
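The definition can be illustrated with a minimal Python sketch that computes the minimum distance of a toy code by brute force; the repetition-code encoder <code>encode</code> below is a hypothetical stand-in for an arbitrary injective encoding map, not part of any particular standard:

<syntaxhighlight lang="python">
from itertools import combinations, product

def hamming_distance(x, y):
    """Number of positions in which two equal-length words differ."""
    return sum(a != b for a, b in zip(x, y))

# Hypothetical toy encoder: the binary repetition code with k = 1, n = 3.
def encode(message):
    return message * 3

messages = ["".join(bits) for bits in product("01", repeat=1)]
codewords = [encode(m) for m in messages]

# Minimum distance: smallest pairwise Hamming distance over distinct codewords.
d = min(hamming_distance(c1, c2) for c1, c2 in combinations(codewords, 2))

# For a linear code this equals the minimum weight of a nonzero codeword.
w_min = min(c.count("1") for c in codewords if "1" in c)

print(d, w_min)  # both equal 3 for this code
</syntaxhighlight>

Because this toy code is linear, the brute-force pairwise minimum and the minimum nonzero weight coincide, matching the identity above.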
== Error detection and correction properties ==
A codeword <math>c \in \Sigma^n</math> can be considered as a point in the <math>n</math>-dimensional space <math>\Sigma^n</math>, and the code <math>\mathcal{C}</math> is a subset of <math>\Sigma^n</math>. Saying that a code <math>\mathcal{C}</math> has distance <math>d</math> means that for every codeword <math>c\in \mathcal{C}</math>, there is no other codeword in the Hamming ball centered at <math>c</math> with radius <math>d-1</math>. As a consequence:
* <math> \mathcal{C}</math> can detect <math>d-1</math> errors: because a codeword <math>c</math> is the only codeword in the Hamming ball centered at itself with radius <math>d-1</math>, no pattern of <math>d-1</math> or fewer errors can change one codeword into another. When the receiver finds that the received vector is not a codeword of <math> \mathcal{C}</math>, the errors are detected (but there is no guarantee that they can be corrected).
* <math> \mathcal{C}</math> can correct <math>\textstyle\left\lfloor {{d-1} \over 2}\right\rfloor</math> errors: because any two distinct codewords are at distance at least <math>d</math>, the Hamming balls of radius <math>\textstyle\left \lfloor {{d-1} \over 2}\right \rfloor</math> centered at distinct codewords do not overlap. Therefore, if error correction is viewed as finding the codeword closest to the received word <math>y</math>, then as long as the number of errors is no more than <math>\textstyle\left \lfloor {{d-1} \over 2}\right \rfloor</math>, there is exactly one codeword in the Hamming ball centered at <math>y</math> with that radius, and all errors can be corrected (see the sketch after this list).
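A minimal Python sketch of both properties, assuming a hypothetical length-5 binary repetition code (so <math>d = 5</math>) purely for illustration:

<syntaxhighlight lang="python">
def hamming_distance(x, y):
    """Number of positions in which two equal-length words differ."""
    return sum(a != b for a, b in zip(x, y))

# Hypothetical toy code: the binary repetition code of length 5, which has d = 5.
codewords = ["00000", "11111"]

def detect(received):
    """Error detection: flag any received word that is not a codeword.
    This catches every pattern of up to d - 1 = 4 errors."""
    return received not in codewords

def correct(received):
    """Nearest-codeword decoding: corrects up to floor((d-1)/2) = 2 errors."""
    return min(codewords, key=lambda c: hamming_distance(c, received))

print(detect("11011"))   # True: two errors in "11111" are detected
print(correct("11011"))  # "11111": the two errors are corrected
</syntaxhighlight>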
== Lower and upper bounds of block codes ==
[[File:HammingLimit.png|thumb|720px|Hamming limit{{clarify|reason='Base' from y-axis legend does not occur in this article's textual content.|date=January 2022}}]]
[[File:Linear Binary Block Codes and their needed Check Symbols.png|thumb|720px|
There are theoretical limits (such as the Hamming limit), but another question is which codes can actually be constructed. It is like [[Sphere packing|packing spheres in a box]] in many dimensions. This diagram shows the constructible codes, which are linear and binary. The ''x'' axis shows the number of protected symbols ''k'', the ''y'' axis the number of needed check symbols ''n–k''. Plotted are the limits for different Hamming distances from 1 (unprotected) to 34.
Marked with dots are perfect codes:
{{bulleted list
To explore the relationship between <math>R(C)</math> and <math>\delta(C)</math>, the following lower and upper bounds on block codes are known.
=== Hamming bound ===
{{main article|Hamming bound}}
: <math> R \le 1- {1 \over n} \cdot \log_{q}\left[\sum_{i=0}^{\left\lfloor {{\delta \cdot n-1}\over 2}\right\rfloor}\binom{n}{i}(q-1)^i\right]</math>
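As an illustration, the following Python sketch evaluates the right-hand side of this bound for concrete parameters; the function name <code>hamming_bound_rate</code> is only an illustrative label:

<syntaxhighlight lang="python">
from math import comb, log

def hamming_bound_rate(n, d, q=2):
    """Right-hand side of the Hamming bound: an upper bound on the rate
    of any q-ary block code of length n and minimum distance d."""
    t = (d - 1) // 2                                    # correctable errors
    ball = sum(comb(n, i) * (q - 1) ** i for i in range(t + 1))
    return 1 - log(ball, q) / n

# Perfect codes, such as the [7,4,3] binary Hamming code, meet the bound
# with equality: the result below is exactly the rate 4/7.
print(hamming_bound_rate(7, 3))   # 0.5714...
</syntaxhighlight>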
=== Singleton bound ===
{{main article|Singleton bound}}
The Singleton bound states that the sum of the rate and the relative distance of a block code cannot be much larger than 1:
:<math> R + \delta \le 1+\frac{1}{n}</math>.
[[Reed–Solomon code]]s are non-trivial examples of codes that satisfy the Singleton bound with equality.
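As a quick numerical check, assuming the classical Reed–Solomon parameters <math>n=255</math>, <math>k=223</math>, <math>d=33</math> over GF(2<sup>8</sup>) purely as an example:

<syntaxhighlight lang="python">
def singleton_slack(n, k, d):
    """1 + 1/n - (R + delta); non-negative for every block code and zero
    exactly for maximum distance separable (MDS) codes."""
    rate, delta = k / n, d / n
    return 1 + 1 / n - (rate + delta)

# Reed–Solomon example: n = 255, k = 223, d = n - k + 1 = 33.
print(singleton_slack(255, 223, 33))   # approximately 0.0: equality holds
</syntaxhighlight>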
=== Plotkin bound ===
{{main article|Plotkin bound}}
For <math>q=2</math>, <math>R+2\delta\le1</math>. In other words, <math>k + 2d \le n</math>.
For any {{mvar|q}}-ary code with relative distance <math>\delta</math>, <math>R \le 1- \left({q \over {q-1}}\right) \delta + o\left(1\right)</math>.
=== Gilbert–Varshamov bound ===
{{main article|Gilbert–Varshamov bound}}
<math>R\ge1-H_q\left(\delta\right)-\epsilon</math>, where <math>0 \le \delta \le 1-{1\over q}, 0\le \epsilon \le 1- H_q\left(\delta\right)</math>,
and <math> H_q\left(x\right) ~\overset{\underset{\mathrm{def}}{}}{=}~ -x\cdot\log_q{x \over {q-1}}-\left(1-x\right)\cdot\log_q{\left(1-x\right)} </math> is the {{mvar|q}}-ary entropy function.
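A minimal Python sketch of the bound, with <code>h_q</code> implementing the entropy function defined above and <code>gv_rate</code> returning the guaranteed rate <math>1-H_q(\delta)</math> (the <math>\epsilon</math> slack is ignored); both function names are illustrative:

<syntaxhighlight lang="python">
from math import log

def h_q(x, q=2):
    """q-ary entropy function H_q(x) as defined above."""
    if x == 0:
        return 0.0
    return -x * log(x / (q - 1), q) - (1 - x) * log(1 - x, q)

def gv_rate(delta, q=2):
    """Gilbert-Varshamov bound: some q-ary code of relative distance delta
    achieves rate at least 1 - H_q(delta), up to the epsilon slack."""
    return 1 - h_q(delta, q)

print(gv_rate(0.1))         # about 0.53 for binary codes
print(gv_rate(0.1, q=4))    # a larger alphabet yields a higher guaranteed rate
</syntaxhighlight>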
=== Johnson bound ===
{{main article|Johnson bound}}
Define <math>J_q\left(\delta\right) ~\overset{\underset{\mathrm{def}}{}}{=}~ \left(1-{1\over q}\right)\left(1-\sqrt{1-{q \delta \over{q-1}}}\right) </math>. <br />
Let <math>J_q\left(n, d, e\right)</math> be the maximum number of codewords in a Hamming ball of radius {{mvar|e}} for any code <math>C \subseteq \mathbb{F}_q^n</math> of distance {{mvar|d}}.
Then we have the ''Johnson bound'': <math>J_q\left(n,d,e\right)\le qnd</math>, if <math>{e \over n} \le {{q-1}\over q}\left( {1-\sqrt{1-{q \over{q-1}}\cdot{d \over n}}}\, \right)=J_q\left({d \over n}\right)</math>
=== Elias–Bassalygo bound ===
{{main article|Elias Bassalygo bound}}
: <math>R={\log_q{|C|} \over n} \le 1-H_q\left(J_q\left(\delta\right)\right)+o\left(1\right) </math>
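The following Python sketch combines the Johnson radius <math>J_q</math> from the previous section with the <math>q</math>-ary entropy function to evaluate the Elias–Bassalygo upper bound, dropping the <math>o(1)</math> term; the function names are illustrative:

<syntaxhighlight lang="python">
from math import log, sqrt

def h_q(x, q=2):
    """q-ary entropy function (see the Gilbert-Varshamov bound above)."""
    if x == 0:
        return 0.0
    return -x * log(x / (q - 1), q) - (1 - x) * log(1 - x, q)

def j_q(delta, q=2):
    """Johnson radius J_q(delta) as defined in the Johnson bound section."""
    return (1 - 1 / q) * (1 - sqrt(1 - q * delta / (q - 1)))

def elias_bassalygo_rate(delta, q=2):
    """Elias-Bassalygo upper bound on the rate, with the o(1) term dropped."""
    return 1 - h_q(j_q(delta, q), q)

# At relative distance 0.1 the binary Elias-Bassalygo upper bound (about 0.70)
# lies above the Gilbert-Varshamov lower bound (about 0.53), as it must.
print(elias_bassalygo_rate(0.1))
print(1 - h_q(0.1))
</syntaxhighlight>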
== See also ==
* [[Shannon–Hartley theorem]]
* [[Noisy channel]]
* [[List decoding]]<ref name="schlegel" />
* [[Sphere packing]]
== References ==
{{reflist}}
{{refbegin}}
* {{cite book | author=J.H. van Lint | authorlink=Jack van Lint | title=Introduction to Coding Theory | edition=2nd | publisher=Springer-Verlag | series=[[Graduate Texts in Mathematics|GTM]] | volume=86 | year=1992 | isbn=3-540-54894-7 | page=[https://archive.org/details/introductiontoco0000lint/page/31 31] | url=https://archive.org/details/introductiontoco0000lint/page/31 }}
* {{cite book | author=F.J. MacWilliams | authorlink=Jessie MacWilliams |author2=N.J.A. Sloane |authorlink2=Neil Sloane | title=The Theory of Error-Correcting Codes | url=https://archive.org/details/theoryoferrorcor0000macw | url-access=registration | publisher=North-Holland | year=1977 | isbn=0-444-85193-3 | page=[https://archive.org/details/theoryoferrorcor0000macw/page/35 35]}}▼
* {{cite book | author=S. Lin |author2=D. J. Jr. Costello | title= Error Control Coding: Fundamentals and Applications | publisher=Prentice-Hall | year=1983 | isbn=0-13-283796-X}}
{{refend}}
== External links ==