Generalized minimum-distance decoding

# Run the errors-and-erasures decoding algorithm for <math>C_\text{out}</math> on <math>\mathbf{y}^{\prime\prime} = (y_1^{\prime\prime}, \ldots, y_N^{\prime\prime})</math>.
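
The outer decoding in this last step can be pictured as follows. This is a minimal sketch only: the decoder <code>decode_errors_and_erasures</code> is a hypothetical errors-and-erasures decoder for <math>C_\text{out}</math> (for instance a [[Reed–Solomon error correction|Reed–Solomon]] decoder), and <code>ERASURE</code> is an illustrative stand-in for the erasure symbol '?'; neither name is part of the algorithm itself.

<syntaxhighlight lang="python">
ERASURE = None  # illustrative stand-in for the erasure symbol '?'

def run_outer_decoder(y_pp, decode_errors_and_erasures):
    """Final GMD step: decode y'' = (y''_1, ..., y''_N) with the outer code.

    y_pp                       -- list of N outer-code symbols, with ERASURE marking
                                  the positions erased in the preceding steps
    decode_errors_and_erasures -- assumed decoder for C_out that corrects any pattern
                                  of e errors and s erasures with 2e + s < D
    """
    # Collect the erased positions and hand the partially erased word,
    # together with those positions, to the outer decoder.
    erased_positions = [i for i, sym in enumerate(y_pp) if sym is ERASURE]
    return decode_errors_and_erasures(y_pp, erased_positions)
</syntaxhighlight>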
 
'''Theorem 1.''' ''Let'' <math>\mathbf{y}</math> ''be a received word such that there exists a [[codeword]]'' <math>\mathbf{c} = (c_1,\ldots, c_N) \in C_\text{out}\circ{C_\text{in}} \subseteq [q^n]^N</math> ''such that'' <math>\Delta(\mathbf{c}, \mathbf{y}) < \frac{Dd}{2}</math>. ''Then the deterministic GMD algorithm outputs'' <math>\mathbf{c}</math>.
 
Note that a [[Concatenated_codes|naive decoding algorithm for concatenated codes]] can correct up to <math>\frac{Dd}{4}</math> errors.
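
For example, with outer distance <math>D = 10</math> and inner distance <math>d = 6</math> (so <math>Dd = 60</math>), the naive algorithm is guaranteed to correct at most <math>\frac{Dd}{4} = 15</math> errors, whereas Theorem 1 guarantees that the GMD algorithm recovers <math>\mathbf{c}</math> from any pattern of fewer than <math>\frac{Dd}{2} = 30</math> errors.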
'''Proof of Lemma 1.''' For every <math>1 \le i \le N</math>, define <math>e_i = \Delta(y_i, c_i)</math>. Since <math>\Delta(\mathbf{y}, \mathbf{c}) = \sum_{i=1}^N e_i</math>, the hypothesis <math>\Delta(\mathbf{c}, \mathbf{y}) < \frac{Dd}{2}</math> implies that
 
<math>\sum_{i=1}^N e_i < \frac{Dd}{2} \qquad\qquad (1)</math>
 
Next, for every <math>1 \le i \le N</math>, we define two [[Indicator variable|indicator variables]]: