==Introduction==
In [[coding theory]], Generalized Minimum Distance (GMD) decoding provides an efficient [[algorithm]] for decoding [[concatenated code]]s, which is based on using an errors and erasures decoder for the outer code.
A naive decoding algorithm for concatenated codes cannot be optimal, because it does not take into account the information that [[maximum likelihood decoding]] (MLD) gives: inner received words are treated the same regardless of how far they are from the nearest inner codeword. Intuitively, the outer decoder should place higher confidence in symbols whose inner encodings are close to the received word. [[Dave Forney|David Forney]] devised a better algorithm in 1966, called generalized minimum distance (GMD) decoding, which makes use of this confidence information by erasing symbols whose confidence falls below a desired threshold; it was one of the first examples of soft-decision decoding. We will present three versions of the GMD decoding algorithm: the first two are [[randomized algorithm]]s, while the last one is a [[deterministic algorithm]].
==Setup==
# Hamming distance : Given two vectors <math>u, v \in \Sigma^n</math>, the Hamming distance between <math>u</math> and <math>v</math>, denoted by <math>\Delta(u, v)</math>, is defined to be the number of positions in which <math>u</math> and <math>v</math> differ.
# Minimum distance : Let <math>C \subseteq \Sigma^n</math> be a code. The minimum distance of the code <math>C</math> is defined to be <math>d = \min_{c_1 \ne c_2 \in C} \Delta(c_1, c_2)</math>.
# Code concatenation : Given a message <math>m = (m_1, \ldots, m_K) \in [Q]^K</math>, consider two codes, an outer code <math>C_{out} : [Q]^K \rightarrow [Q]^N</math> and an inner code <math>C_{in} : [q]^k \rightarrow [q]^n</math>, with minimum distances <math>D</math> and <math>d</math> respectively. A concatenated code is obtained as <math>C_{out} \circ C_{in}(m) = (C_{in}(C_{out}(m)_1), \ldots, C_{in}(C_{out}(m)_N))</math>, where <math>C_{out}(m) = (C_{out}(m)_1, \ldots, C_{out}(m)_N)</math>. Finally we will take <math>C_{out}</math> to be a [[Reed–Solomon error correction|Reed–Solomon code]], which has an errors and erasures decoder, and <math>K = O(\log N)</math>, which in turn implies that MLD on the inner code runs in poly(<math>N</math>) time.
# Maximum likelihood decoding (MLD) : MLD is a decoding method for error correcting codes that outputs the codeword closest to the received word in Hamming distance. The MLD function, denoted by <math>D_{MLD} : \Sigma^n \rightarrow C</math>, is defined as follows: for every <math>y \in \Sigma^n</math>, <math>D_{MLD}(y) = \arg\min_{c \in C} \Delta(c, y)</math> (see the sketch following this list).
# [[Probability density function]] : A [[probability distribution]] <math>\Pr[\bullet]</math> on a sample space <math>S</math> is a mapping from events of <math>S</math> to [[real number]]s such that <math>\Pr[A] \ge 0</math> for every event <math>A</math>, <math>\Pr[S] = 1</math>, and <math>\Pr[A \cup B] = \Pr[A] + \Pr[B]</math> for every two mutually exclusive events <math>A</math> and <math>B</math>.
# [[Expected value]] : The expected value of a [[discrete random variable]] <math>X</math> is <math>\mathbb{E}[X] = \sum_x x \cdot \Pr[X = x]</math>.
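To make these definitions concrete, here is a minimal Python sketch of Hamming distance, minimum distance, brute-force MLD, and code concatenation. The function names and the representation of codes as lists of tuples are illustrative choices, not part of any standard library; brute-force MLD is only feasible for small codes, which is why the setup above insists on <math>K = O(\log N)</math>.
<syntaxhighlight lang="python">
def hamming_distance(u, v):
    """Delta(u, v): the number of positions in which u and v differ."""
    return sum(a != b for a, b in zip(u, v))

def minimum_distance(code):
    """Minimum Hamming distance over all pairs of distinct codewords."""
    return min(hamming_distance(c1, c2)
               for c1 in code for c2 in code if c1 != c2)

def mld(code, y):
    """Maximum likelihood decoding: the codeword of `code` closest to y."""
    return min(code, key=lambda c: hamming_distance(c, y))

def concatenated_encode(m, encode_out, encode_in):
    """(C_out o C_in)(m): outer-encode m, then inner-encode each outer symbol."""
    return [encode_in(symbol) for symbol in encode_out(m)]
</syntaxhighlight>
For example, <code>minimum_distance([(0, 0, 0), (1, 1, 1)])</code> returns 3.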
==Randomized algorithm==
Consider the received word <math>\mathbf{y} = (y_1, \ldots, y_N) \in [q^n]^N</math>, which was corrupted by a [[noisy channel]]. The algorithm decodes <math>\mathbf{y}</math> by declaring erasures at unreliable positions and running the errors and erasures decoding algorithm for <math>C_{out}</math> on the result.
'''Randomized_Decoder'''
Given : <math>\mathbf{y} = (y_1, \ldots, y_N) \in [q^n]^N</math>.
# Compute <math>y_i' = MLD_{C_{in}}(y_i)</math> for every <math>1 \le i \le N</math>.
# Set <math>\omega_i = \min\left(\Delta(C_{in}(y_i'), y_i), {d \over 2}\right)</math> for every <math>1 \le i \le N</math>.
# For every <math>1 \le i \le N</math> independently: with probability <math>{2\omega_i \over d}</math>, set <math>y_i'' \leftarrow {?}</math> (an erasure); otherwise set <math>y_i'' = y_i'</math>.
# Run errors and erasure algorithm for <math>C_{out}</math> on <math>\mathbf{y}^{\prime\prime} = (y_1^{\prime\prime}, \ldots, y_N^{\prime\prime})</math>.
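The following Python sketch mirrors the four steps above. It reuses <code>hamming_distance</code> from the Setup sketch and assumes hypothetical arguments <code>encode_in</code> and <code>mld_in</code> (the inner encoder and its MLD decoder) and <code>decode_out_ee</code> (the outer errors and erasures decoder); none of these names are prescribed by the algorithm itself.
<syntaxhighlight lang="python">
import random

ERASURE = None  # plays the role of the erasure symbol "?"

def randomized_gmd_decode(y, encode_in, mld_in, d, decode_out_ee):
    """One run of Randomized_Decoder on y = (y_1, ..., y_N)."""
    y_pp = []
    for y_i in y:
        y_i_prime = mld_in(y_i)                                   # step 1
        # step 2: confidence weight omega_i, capped at d/2
        w_i = min(hamming_distance(encode_in(y_i_prime), y_i), d / 2)
        # step 3: declare an erasure with probability 2*w_i/d
        y_pp.append(ERASURE if random.random() < 2 * w_i / d else y_i_prime)
    return decode_out_ee(y_pp)                                    # step 4
</syntaxhighlight>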
'''Theorem 1.''' ''Let'' <math>\mathbf{y}</math> ''be a received word such that there exists a [[codeword]]'' <math>\mathbf{c} = (c_1, \ldots, c_N) \in C_{out} \circ C_{in} \subseteq [q^n]^N</math> ''such that'' <math>\Delta(\mathbf{c}, \mathbf{y}) < {Dd \over 2}</math>. ''Then the deterministic GMD algorithm outputs'' <math>\mathbf{c}</math>.
Note that a naive decoding algorithm for concatenated codes can correct up to <math>{Dd \over 4}</math> errors.
'''Lemma 1.''' ''Let the assumption in Theorem 1 hold. And if'' <math>\mathbf{y}''</math> ''has'' <math>e'</math> ''errors and'' <math>s'</math> ''erasures (when compared with'' <math>\mathbf{c}</math>'') after'' '''Step 1''', ''then'' <math>\mathbb{E}[2e' + s'] < D</math>.
==Modified randomized algorithm==
Note that, in the previous version of the GMD algorithm in step "3", we do not really need to use "fresh" [[randomness]] for each <math>i</math>. Now we come up with another randomized version of the GMD algorithm that uses the ''same'' randomness for every <math>i</math>.
'''Modified_Randomized_Decoder'''
Given : <math>\mathbf{y} = (y_1, \ldots, y_N) \in [q^n]^N</math>, pick <math>\theta \in [0, 1]</math> uniformly at random. Then, for every <math>1 \le i \le N</math>:
# Compute <math>y_i' = MLD_{C_{in}}(y_i)</math>.
# Set <math>\omega_i = \min\left(\Delta(C_{in}(y_i'), y_i), {d \over 2}\right)</math>.
# If <math>\theta < {2\omega_i \over d}</math>, set <math>y_i'' \leftarrow {?}</math>; otherwise set <math>y_i'' = y_i'</math>.
# Run errors and erasure algorithm for <math>C_{out}</math> on <math>\mathbf{y}^{\prime\prime} = (y_1^{\prime\prime},..., y_N^{\prime\prime})</math>.
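Under the same assumptions as the previous sketch, the only change is that a single uniformly random threshold <math>\theta</math> drives every erasure decision:
<syntaxhighlight lang="python">
def modified_randomized_gmd_decode(y, encode_in, mld_in, d, decode_out_ee):
    """Modified_Randomized_Decoder: one threshold theta shared by all i."""
    theta = random.uniform(0, 1)                       # the only random choice
    y_pp = []
    for y_i in y:
        y_i_prime = mld_in(y_i)                                        # step 1
        w_i = min(hamming_distance(encode_in(y_i_prime), y_i), d / 2)  # step 2
        y_pp.append(ERASURE if theta < 2 * w_i / d else y_i_prime)     # step 3
    return decode_out_ee(y_pp)                                         # step 4
</syntaxhighlight>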
For the proof of '''[[Lemma (mathematics)|Lemma 1]]''', we only use the randomness to show that <math>\Pr[y_i'' = {?}] = {2\omega_i \over d}</math>. In this version of the algorithm, since <math>\theta</math> is chosen uniformly from <math>[0, 1]</math>, we still have <math>\Pr[y_i'' = {?}] = \Pr\left[\theta \in \left[0, {2\omega_i \over d}\right]\right] = {2\omega_i \over d}</math>, so the lemma continues to hold.
Since the outcome of the algorithm only changes when <math>\theta</math> crosses one of the thresholds <math>{2\omega_i \over d}</math>, it suffices to try every value in the finite set <math>Q = \{0, 1\} \cup \left\{{2\omega_1 \over d}, \ldots, {2\omega_N \over d}\right\}</math>. This gives the following deterministic algorithm.

==Deterministic algorithm==
'''Deterministic_Decoder'''
Given : <math>\mathbf{y} = (y_1, \ldots, y_N) \in [q^n]^N</math>, for every <math>\theta \in Q</math>, repeat the following.
# Compute <math>y_i' = MLD_{C_{in}}(y_i)</math> for <math>1 \le i \le N</math>.
# Set <math>\omega_i = \min\left(\Delta(C_{in}(y_i'), y_i), {d \over 2}\right)</math> for every <math>1 \le i \le N</math>.
# If <math>\theta < {2\omega_i \over d}</math>, set <math>y_i'' \leftarrow {?}</math>; otherwise set <math>y_i'' = y_i'</math>.
# Run the errors and erasure algorithm for <math>C_{out}</math> on <math>\mathbf{y}'' = (y_1'', \ldots, y_N'')</math>, and let <math>c_\theta</math> be the codeword it outputs (if any).
# Among all the <math>c_\theta</math> output in step 4, output the one closest to <math>\mathbf{y}</math>.
Every loop of steps 1–4 can be run in [[polynomial time]], so the algorithm above also runs in polynomial time.
Specifically, each call to an errors and erasures decoder that corrects fewer than <math>{dD \over 2}</math> errors takes <math>O(d)</math> time. Finally, the runtime of the algorithm above is <math>O(NQn^{O(1)} + NT_{out})</math>, where <math>T_{out}</math> is the running time of the outer errors and erasures decoder.
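A sketch of the derandomized loop, again under the same assumed helpers as the earlier sketches; here <code>decode_out_ee</code> is taken to return <code>None</code> on failure, and candidates are compared to <math>\mathbf{y}</math> by re-encoding each outer symbol with <math>C_{in}</math>:
<syntaxhighlight lang="python">
def deterministic_gmd_decode(y, encode_in, mld_in, d, decode_out_ee):
    """Derandomized GMD: try every threshold in Q = {0, 1} U {2*w_i/d}."""
    y_prime = [mld_in(y_i) for y_i in y]                          # step 1
    w = [min(hamming_distance(encode_in(p), y_i), d / 2)          # step 2
         for p, y_i in zip(y_prime, y)]
    thresholds = {0.0, 1.0} | {2 * w_i / d for w_i in w}          # the set Q
    candidates = []
    for theta in sorted(thresholds):
        y_pp = [ERASURE if theta < 2 * w_i / d else p             # step 3
                for p, w_i in zip(y_prime, w)]
        c_theta = decode_out_ee(y_pp)                             # step 4
        if c_theta is not None:
            candidates.append(c_theta)

    def distance_to_y(c):
        # compare a decoded outer codeword to y block by block,
        # re-encoding each outer symbol with the inner code
        return sum(hamming_distance(encode_in(c_i), y_i)
                   for c_i, y_i in zip(c, y))

    # step 5: among all candidates, output the one closest to y
    return min(candidates, key=distance_to_y, default=None)
</syntaxhighlight>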
==References==
#[http://www.cs.washington.edu/education/courses/cse533/06au University of Washington - Venkatesan Guruswami]
# G. David Forney. Generalized Minimum Distance decoding. ''IEEE Transactions on Information Theory'', 12:125–131, 1966.