# Run the errors-and-erasures decoding algorithm for <math>C_\text{out}</math> on <math>\mathbf{y}^{\prime\prime} = (y_1^{\prime\prime}, \ldots, y_N^{\prime\prime})</math>.
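For concreteness, the following is a minimal Python sketch of one way such a randomized decoder can be implemented, assuming the decoder declares an erasure at position <math>i</math> with probability <math>2\omega_i/d</math>, where <math>\omega_i</math> is the distance of the <math>i</math>-th received block from its nearest inner codeword, capped at <math>d/2</math>. The helper names <code>mld_in</code>, <code>encode_in</code>, <code>eed_out</code>, and <code>hamming</code> are hypothetical placeholders for an inner maximum-likelihood decoder, the inner encoder, an outer errors-and-erasures decoder, and the Hamming distance; erasures are represented by <code>None</code>.

<syntaxhighlight lang="python">
import random


def hamming(a, b):
    """Hamming distance between two equal-length sequences."""
    return sum(x != y for x, y in zip(a, b))


def randomized_gmd_decode(y, mld_in, encode_in, eed_out, d):
    """Sketch of a randomized GMD decoder.

    Hypothetical helpers (not specified in this section):
      mld_in(block)  -- maximum-likelihood decoder for the inner code C_in
      encode_in(sym) -- encoder for the inner code C_in
      eed_out(word)  -- errors-and-erasures decoder for the outer code C_out,
                        with erasures marked as None
      d              -- minimum distance of the inner code C_in
    """
    y_pp = []
    for y_i in y:
        # Decode the inner block and measure how far the received block
        # is from its nearest inner codeword (capped at d/2).
        y_i_prime = mld_in(y_i)
        w_i = min(hamming(encode_in(y_i_prime), y_i), d / 2)
        # Declare an erasure with probability 2*w_i/d; otherwise keep the symbol.
        y_pp.append(None if random.random() < 2 * w_i / d else y_i_prime)
    # Run the errors-and-erasures algorithm for C_out on y''.
    return eed_out(y_pp)
</syntaxhighlight>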
'''Theorem 1.''' ''Let'' <math>\mathbf{y}</math> ''be a received word such that there exists a [[codeword]]'' <math>\mathbf{c} = (c_1,\ldots, c_N) \in C_\text{out}\circ{C_\text{in}} \subseteq [q^n]^N</math> ''such that'' <math>\Delta(\mathbf{c}, \mathbf{y}) < \tfrac{Dd}{2}</math>. ''Then the GMD decoding algorithm outputs'' <math>\mathbf{c}</math>.
Note that a [[Concatenated_codes|naive decoding algorithm for concatenated codes]] can correct up to <math>Dd \over 4</math> errors.
'''Lemma 1.''' ''Let the assumption in Theorem 1 hold. If'' <math>\mathbf{y^{\prime\prime}}</math> ''has'' <math>e'</math> ''errors and'' <math>s'</math> ''erasures (when compared with'' <math>\mathbf{c}</math>'') after'' '''Step 1''', ''then'' <math>\mathbb{E}[2e' + s'] < D</math>.
If <math>2e' + s' < D</math>, then the errors-and-erasures decoding algorithm in the final step will output <math>\mathbf{c}</math>; the lemma above says that this condition holds in expectation.
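One way to see why this condition suffices: removing the <math>s'</math> erased coordinates leaves a code of minimum distance at least <math>D - s'</math>, which can correct the remaining <math>e'</math> errors as long as

: <math>2e' < D - s', \quad \text{that is,} \quad 2e' + s' < D.</math>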
'''Proof of Lemma 1.''' For every <math>1 \le i \le N</math>, define <math>e_i = \Delta(y_i, c_i)</math>. This implies that

: <math>\sum_{i=1}^N e_i < \frac{Dd}{2}. \qquad\qquad (1)</math>

Next, for every <math>1 \le i \le N</math>, define two [[Indicator function|indicator variables]]: <math>X_i^? = 1</math> if and only if <math>y_i'' = ?</math>, and <math>X_i^e = 1</math> if and only if <math>C_\text{in}(y_i'') \ne c_i</math> and <math>y_i'' \ne ?</math>. It suffices to show that, for every <math>1 \le i \le N</math>,
: <math>\mathbb{E}\left[2X_i^e + X_i^?\right] \le \frac{2e_i}{d}. \qquad\qquad (2)</math>
Clearly, by definition <math>e' = \sum_{i}X_i^e</math> and <math>s' = \sum_{i}X_i^?</math>. Further, by the [[Expected value|linearity of expectation]], we get <math>\mathbb{E}[2e' + s'] \le \frac{2}{d}\sum_i e_i < D.</math>
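Written out in full, this bound combines (2), the linearity of expectation, and (1):

: <math>\mathbb{E}[2e' + s'] = \sum_{i=1}^N \mathbb{E}\left[2X_i^e + X_i^?\right] \le \frac{2}{d}\sum_{i=1}^N e_i < \frac{2}{d}\cdot\frac{Dd}{2} = D.</math>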
To prove (2), two cases are considered, depending on whether the <math>i</math>-th block is decoded correctly by the inner decoder.

'''Case 1:''' <math>(c_i = C_\text{in}(y_i'))</math>