 
== Generator and check matrices ==
As a [[linear subspace]] of <math>\mathbb{F}_q^n</math>, the entire code ''C'' (which may be very large) may be represented as the [[span (linear algebra)|span]] of a set of <math>k</math> codewords (known as a [[basis (linear algebra)|basis]] in [[linear algebra]]). These basis codewords are often collated in the rows of a matrix ''G'' known as a '''[[Generator matrix|generating matrix]]''' for the code ''C''. When ''G'' has the block matrix form <math>\boldsymbol{G} = [I_k \mid P]</math>, where <math>I_k</math> denotes the <math>k \times k</math> identity matrix and ''P'' is some <math>k \times (n-k)</math> matrix, then we say ''G'' is in '''standard form'''.
 
A matrix ''H'' representing a linear function <math>\phi : \mathbb{F}_q^n\to \mathbb{F}_q^{n-k}</math> whose [[Kernel (matrix)|kernel]] is ''C'' is called a '''[[check matrix]]''' of ''C'' (or sometimes a parity check matrix). Equivalently, ''H'' is a matrix whose [[null space]] is ''C''. If ''C'' is a code with a generating matrix ''G'' in standard form, <math>\boldsymbol{G} = [I_k \mid P]</math>, then <math>\boldsymbol{H} = [-P^{T} \mid I_{n-k} ]</math> is a check matrix for ''C''. The code generated by ''H'' is called the '''dual code''' of ''C''. It can be verified that ''G'' is a <math>k \times n</math> matrix, while ''H'' is an <math>(n-k) \times n</math> matrix.
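As a sketch of this construction (function and variable names here are illustrative, not from the article), the following builds <math>H = [-P^T \mid I_{n-k}]</math> from a standard-form generator matrix over GF(2), where <math>-P^T = P^T</math> because &minus;1 = 1 (mod 2):

```python
def check_matrix_from_standard_form(G, k, n):
    """Given G = [I_k | P] as a list of rows over GF(2), return H = [P^T | I_{n-k}]."""
    P = [row[k:] for row in G]                         # the k x (n-k) block P
    r = n - k
    H = []
    for i in range(r):
        row = [P[j][i] for j in range(k)]              # i-th row of P^T
        row += [1 if c == i else 0 for c in range(r)]  # append i-th row of I_{n-k}
        H.append(row)
    return H

# toy [5,2] code with P = [[1,1,0],[0,1,1]]
G = [[1,0,1,1,0],
     [0,1,0,1,1]]
H = check_matrix_from_standard_form(G, k=2, n=5)

# every row of G lies in the null space of H: G H^T = 0 (mod 2)
assert all(sum(g[j]*h[j] for j in range(5)) % 2 == 0 for g in G for h in H)
```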
 
Linearity guarantees that the minimum [[Hamming distance]] ''d'' between a codeword ''c''<sub>0</sub> and any of the other codewords ''c''&nbsp;≠&nbsp;''c''<sub>0</sub> is independent of ''c''<sub>0</sub>. This follows from the property that the difference ''c''&nbsp;&minus;&nbsp;''c''<sub>0</sub> of two codewords in ''C'' is also a codeword (i.e., an [[element (mathematics)|element]] of the subspace ''C''), and the property that ''d''(''c'',&nbsp;''c''<sub>0</sub>)&nbsp;=&nbsp;''d''(''c''&nbsp;&minus;&nbsp;''c''<sub>0</sub>,&nbsp;0). These properties imply that the minimum distance of a linear code equals the minimum [[Hamming weight]] of a nonzero codeword: <math>d = \min_{c \in C,\ c \neq 0} wt(c)</math>.
The distance ''d'' of a linear code ''C'' also equals the minimum number of linearly dependent columns of the check matrix ''H''.
 
''Proof:'' Because <math>\boldsymbol{H} \cdot \boldsymbol{c}^T = \boldsymbol{0}</math>, which is equivalent to <math>\sum_{i=1}^n (c_i \cdot \boldsymbol{H_i}) = \boldsymbol{0}</math>, where <math>\boldsymbol{H_i}</math> is the <math>i</math>-th column of <math>\boldsymbol{H}</math>, the columns <math>\boldsymbol{H_i}</math> with <math>c_i \neq 0</math> are linearly dependent. A codeword of weight <math>d</math> therefore yields a set of <math>d</math> linearly dependent columns, so the minimum number of linearly dependent columns is at most <math>d</math>. Conversely, consider a minimum set of linearly dependent columns <math>\{ \boldsymbol{H_j} \mid j \in S \}</math>, where <math>S</math> is the column index set, with dependency coefficients <math>c_j'</math> for <math>j \in S</math>. Define the vector <math>\boldsymbol{c'}</math> by setting <math>c_j' = 0</math> for <math>j \notin S</math>. Then <math>\boldsymbol{H} \cdot \boldsymbol{c'}^T = \sum_{j \in S} (c_j' \cdot \boldsymbol{H_j}) = \boldsymbol{0}</math>, so <math>\boldsymbol{c'} \in C</math> and hence <math>d \le wt(\boldsymbol{c'}) \le |S|</math>. Combining the two inequalities, <math>d</math> equals the minimum number of linearly dependent columns of <math>\boldsymbol{H}</math>. The claimed property is therefore proven.
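Both characterizations of <math>d</math> can be verified by brute force on the [7,4,3] Hamming code treated later in the article (variable names here are illustrative). Over GF(2), a minimal linearly dependent set of columns is exactly a smallest nonempty set of columns summing to zero:

```python
from itertools import combinations, product

H = [[1,0,1,1,1,0,0],   # check matrix of the [7,4,3] Hamming code
     [1,1,1,0,0,1,0],
     [0,1,1,1,0,0,1]]
m, n = len(H), len(H[0])
cols = [tuple(row[j] for row in H) for j in range(n)]

# smallest nonempty set of columns XOR-ing to zero = minimum dependent set
min_dep = next(size for size in range(1, n + 1)
               if any(all(sum(col[i] for col in subset) % 2 == 0 for i in range(m))
                      for subset in combinations(cols, size)))

# minimum weight of a nonzero vector in the null space of H = minimum distance d
def in_kernel(v):
    return all(sum(v[j] * H[i][j] for j in range(n)) % 2 == 0 for i in range(m))

d = min(sum(v) for v in product([0, 1], repeat=n) if any(v) and in_kernel(v))
assert d == min_dep == 3
```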
 
== Example: Hamming codes ==
'''Example:''' The linear block code with the following generator matrix and parity check matrix is a <math>[7,4,3]_2</math> Hamming code.
 
: <math>\boldsymbol{G}=\begin{pmatrix} 1 & 0 & 0 & 0 & 1 & 1 & 0 \\ 0 & 1 & 0 & 0 & 0 & 1 & 1 \\ 0 & 0 & 1 & 0 & 1 & 1 & 1 \\ 0 & 0 & 0 & 1 & 1 & 0 & 1 \end{pmatrix} ,</math> <math>\boldsymbol{H}=\begin{pmatrix} 1 & 0 & 1 & 1 & 1 & 0 & 0 \\ 1 & 1 & 1 & 0 & 0 & 1 & 0 \\ 0 & 1 & 1 & 1 & 0 & 0 & 1 \end{pmatrix}</math>
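These parameters can be confirmed by enumerating the code (an illustrative check, not part of the article): the rows of ''G'' span 2<sup>4</sup> = 16 codewords of length 7, the minimum nonzero weight is 3, and every codeword lies in the null space of ''H'':

```python
from itertools import product

G = [[1,0,0,0,1,1,0],[0,1,0,0,0,1,1],[0,0,1,0,1,1,1],[0,0,0,1,1,0,1]]
H = [[1,0,1,1,1,0,0],[1,1,1,0,0,1,0],[0,1,1,1,0,0,1]]

# all F_2-linear combinations of the rows of G
words = {tuple(sum(a*g[i] for a, g in zip(c, G)) % 2 for i in range(7))
         for c in product([0, 1], repeat=4)}
assert len(words) == 16                                   # k = 4
assert min(sum(w) for w in words if any(w)) == 3          # d = 3
assert all(sum(w[j]*h[j] for j in range(7)) % 2 == 0      # H w^T = 0 for all w
           for w in words for h in H)
```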
 
== Example: Hadamard codes ==
 
'''Example:''' The linear block code with the following generator matrix is an <math>[8,3,4]_2</math> Hadamard code:
<math>\boldsymbol{G}_\mathrm{Had}=\begin{pmatrix} 0 & 0 & 0 & 0 & 1 & 1 & 1 & 1\\ 0 & 0 & 1 & 1 & 0 & 0 & 1 & 1\\ 0 & 1 & 0 & 1 & 0 & 1 & 0 & 1\end{pmatrix}</math>.
 
The [[Hadamard code]] is a special case of [[Reed–Muller code]]. If we take the first column (the all-zero column) out from <math>\boldsymbol{G}_\mathrm{Had}</math>, we get the <math>[7,3,4]_2</math> ''simplex code'', which is the ''dual code'' of the Hamming code.
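Both claims about the parameters can be spot-checked by enumeration (illustrative code; names are not from the article): every nonzero Hadamard codeword has weight exactly 4, and deleting the first coordinate, which is zero in every codeword, leaves a [7,3,4] code:

```python
from itertools import product

G_had = [[0,0,0,0,1,1,1,1],
         [0,0,1,1,0,0,1,1],
         [0,1,0,1,0,1,0,1]]

# all F_2-linear combinations of the rows of G_had
had = {tuple(sum(a*g[i] for a, g in zip(c, G_had)) % 2 for i in range(8))
       for c in product([0, 1], repeat=3)}
assert len(had) == 8                              # k = 3
assert all(sum(w) == 4 for w in had if any(w))    # every nonzero word has weight 4

simplex = {w[1:] for w in had}                    # drop the all-zero first coordinate
assert len(simplex) == 8                          # still 2^3 distinct words, length 7
assert all(sum(w) == 4 for w in simplex if any(w))
```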
 
==Nearest neighbor algorithm==
The parameter ''d'' is closely related to the error-correcting ability of the code. The following construction/algorithm illustrates this (called the nearest neighbor decoding algorithm):
 
Input: A ''received vector'' ''v'' in <math>\mathbb{F}_q^n</math>.
 
Output: A codeword <math>w</math> in <math>C</math> closest to <math>v</math>, if any.
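A direct brute-force sketch of such a decoder (function names are ours, and real implementations use syndrome decoding rather than enumerating all <math>q^k</math> codewords; this version always returns some closest codeword instead of reporting failure):

```python
from itertools import product

def hamming(u, v):
    return sum(a != b for a, b in zip(u, v))

def nearest_neighbor_decode(v, G):
    """Return a codeword of the binary code spanned by G closest to v."""
    k, n = len(G), len(G[0])
    best = None
    for coeffs in product([0, 1], repeat=k):       # enumerate all 2^k codewords
        w = tuple(sum(a*g[i] for a, g in zip(coeffs, G)) % 2 for i in range(n))
        if best is None or hamming(v, w) < hamming(v, best):
            best = w
    return best

# the [7,4,3] Hamming code corrects any single bit error (d = 3)
G = [[1,0,0,0,1,1,0],[0,1,0,0,0,1,1],[0,0,1,0,1,1,1],[0,0,0,1,1,0,1]]
sent = (1,0,0,0,1,1,0)          # a codeword (first row of G)
received = (1,1,0,0,1,1,0)      # one bit flipped in transit
assert nearest_neighbor_decode(received, G) == sent
```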
A code C whose parameters satisfy k+d=n+1 is called '''maximum distance separable''' or '''MDS'''. Such codes, when they exist, are in some sense best possible.
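For instance, the binary repetition and single-parity-check codes are MDS; this can be spot-checked by enumeration (helper names here are ours):

```python
from itertools import product

def params(G):
    """Return (n, k, d) of the binary linear code spanned by the rows of G."""
    k, n = len(G), len(G[0])
    words = [tuple(sum(a*g[i] for a, g in zip(c, G)) % 2 for i in range(n))
             for c in product([0, 1], repeat=k)]
    d = min(sum(w) for w in words if any(w))   # min distance = min nonzero weight
    return n, k, d

rep = [[1, 1, 1, 1, 1]]                        # [5,1,5] repetition code
par = [[1,0,0,1],[0,1,0,1],[0,0,1,1]]          # [4,3,2] single-parity-check code
for G in (rep, par):
    n, k, d = params(G)
    assert k + d == n + 1                      # MDS: k + d = n + 1
```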
 
If ''C''<sub>1</sub> and ''C''<sub>2</sub> are two codes of length ''n'' and if there is a permutation ''p'' in the [[symmetric group]] ''S''<sub>''n''</sub> for which (''c''<sub>1</sub>,...,''c''<sub>''n''</sub>) is in ''C''<sub>1</sub> if and only if (''c''<sub>''p''(1)</sub>,...,''c''<sub>''p''(''n'')</sub>) is in ''C''<sub>2</sub>, then we say ''C''<sub>1</sub> and ''C''<sub>2</sub> are '''permutation equivalent'''. More generally, if there is an <math>n\times n</math> [[monomial matrix]] <math>M\colon \mathbb{F}_q^n \to \mathbb{F}_q^n</math> which sends ''C''<sub>1</sub> isomorphically to ''C''<sub>2</sub> then we say ''C''<sub>1</sub> and ''C''<sub>2</sub> are '''equivalent'''.
 
''Lemma'': Any linear code is permutation equivalent to a code which is in standard form.
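The lemma is constructive: Gaussian elimination over GF(2), with any needed column swaps recorded as a permutation of coordinates, brings a generator matrix with independent rows to standard form. A sketch (names are ours, binary case only):

```python
def to_standard_form(G):
    """Return (G', perm) with G' = [I_k | P] row-equivalent to a column
    permutation of G. Rows of G are assumed linearly independent over GF(2)."""
    G = [row[:] for row in G]
    k, n = len(G), len(G[0])
    perm = list(range(n))
    for i in range(k):
        # find a pivot 1 in some column >= i and row >= i
        pr, pc = next((r, c) for c in range(i, n) for r in range(i, k) if G[r][c])
        G[i], G[pr] = G[pr], G[i]                     # row swap
        if pc != i:                                   # column swap = coordinate permutation
            for row in G:
                row[i], row[pc] = row[pc], row[i]
            perm[i], perm[pc] = perm[pc], perm[i]
        for r in range(k):                            # clear column i in other rows
            if r != i and G[r][i]:
                G[r] = [(a + b) % 2 for a, b in zip(G[r], G[i])]
    return G, perm

S, perm = to_standard_form([[1,1,0,1],[1,0,1,1]])
assert [row[:2] for row in S] == [[1,0],[0,1]]        # left block is I_k
```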
Some examples of linear codes include:
{{div col|colwidth=27em}}
* [[Repetition code]]s
* [[Parity code]]s
* [[Cyclic code]]s
* [[Hamming code]]s
* [[Golay code (disambiguation)|Golay code]], both the [[binary Golay code|binary]] and [[ternary Golay code|ternary]] versions
* [[Polynomial code]]s, of which [[BCH code]]s are an example
* [[Reed–Solomon error correction|Reed–Solomon codes]]
* [[Reed–Muller code]]s
* [[Algebraic geometry code]]s
* [[Binary Goppa code]]s
* [[Low-density parity-check codes]]
* [[Expander code]]s
* [[Multidimensional parity-check code]]s
* [[Toric code]]s
* [[Turbo code]]s
* [[Locally recoverable code]]
{{div col end}}
 
==Generalization==
[[Hamming space]]s over non-field alphabets have also been considered, especially over [[finite ring]]s, most notably [[Galois ring]]s over [[modular arithmetic|'''Z'''<sub>4</sub>]]. This gives rise to [[module (mathematics)|module]]s instead of vector spaces and [[ring-linear code]]s (identified with [[submodule]]s) instead of linear codes. The typical metric used in this case is the [[Lee distance]]. There is a [[Gray isometry]] between <math>\mathbb{Z}_2^{2m}</math> (i.e. GF(2<sup>2''m''</sup>)) with the Hamming distance and <math>\mathbb{Z}_4^m</math> (also denoted as GR(4,''m'')) with the Lee distance; its main attraction is that it exhibits some "good" codes that are not linear over <math>\mathbb{Z}_2^{2m}</math> as images of ring-linear codes from <math>\mathbb{Z}_4^m</math>.<ref name="Greferath2009">{{cite book |editor=Massimiliano Sala |editor2=Teo Mora |editor3=Ludovic Perret |editor4=Shojiro Sakata |editor5=Carlo Traverso|title=Gröbner Bases, Coding, and Cryptography|year=2009|publisher=Springer Science & Business Media|isbn=978-3-540-93806-4|chapter=An Introduction to Ring-Linear Coding Theory|author=Marcus Greferath}}</ref><ref>{{Cite web|url=https://www.encyclopediaofmath.org/index.php/Main_Page|title=Encyclopedia of Mathematics|website=www.encyclopediaofmath.org}}</ref><ref name="Lint1999">{{cite book|author=J.H. van Lint|title=Introduction to Coding Theory|url=https://archive.org/details/introductiontoco0000lint_a3b9|url-access=registration|year=1999|publisher=Springer|isbn=978-3-540-64133-9|edition=3rd|at=Chapter 8: Codes over {{not a typo|ℤ}}<sub>4</sub>}}</ref>
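The Gray map underlying this isometry sends 0 ↦ 00, 1 ↦ 01, 2 ↦ 11, 3 ↦ 10 in each coordinate. A brute-force check that it converts the Lee distance on <math>\mathbb{Z}_4^m</math> into the Hamming distance (illustrative code; names are ours):

```python
from itertools import product

GRAY = {0: (0, 0), 1: (0, 1), 2: (1, 1), 3: (1, 0)}

def gray(v):
    """Gray map Z_4^m -> Z_2^(2m), applied coordinatewise."""
    return tuple(bit for x in v for bit in GRAY[x])

def lee(u, v):
    """Lee distance on Z_4^m: each coordinate contributes min(|a-b|, 4-|a-b|)."""
    return sum(min((a - b) % 4, (b - a) % 4) for a, b in zip(u, v))

def hamming(u, v):
    return sum(a != b for a, b in zip(u, v))

# the Gray map is an isometry: check every pair of vectors in Z_4^2
for u in product(range(4), repeat=2):
    for v in product(range(4), repeat=2):
        assert lee(u, v) == hamming(gray(u), gray(v))
```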
 
Some authors have referred to such codes over rings simply as linear codes as well.<ref name="DoughertyFacchini2015">{{cite book |editor=Steven Dougherty |editor2=Alberto Facchini |editor3=Andre Gerard Leroy |editor4=Edmund Puczylowski |editor5=Patrick Sole|title=Noncommutative Rings and Their Applications|chapter-url=https://books.google.com/books?id=psrXBgAAQBAJ&pg=PA80|year=2015|publisher=American Mathematical Soc.|isbn=978-1-4704-1032-2|page=80|chapter=Open Problems in Coding Theory|author=S.T. Dougherty |author2=J.-L. Kim |author3=P. Sole}}</ref>