==Introduction==
In [[coding theory]], Generalized Minimum Distance (GMD) Decoding provides an efficient [[algorithm]] for decoding [[concatenated code]]s, which is based on using an [[error|errors]]-and-[[Erasure_code|erasures]] [[decoder]] for the [[outer code]].
A [[Decoding_methods|naive decoding algorithm]] for concatenated codes is not an optimal way of decoding because it does not take into account the information that [[Maximum_likelihood_decoding|Maximum Likelihood Decoding (MLD)]] gives. In other words, in the naive algorithm, inner received [[codeword]]s are treated the same regardless of the differences in their [[Hamming distance]]s. Intuitively, the outer decoder should place higher confidence in symbols whose inner [[code|encodings]] are close to the received word. In 1966, [[David Forney]] devised a better algorithm, called Generalized Minimum Distance (GMD) decoding, which makes better use of this information. It measures the confidence of each received codeword and erases symbols whose confidence is below a desired threshold. The GMD decoding algorithm was one of the first examples of a [[Soft-decision_decoder|soft-decision decoder]]. We will present three versions of the GMD decoding algorithm. The first two will be [[randomized algorithm]]s, while the last one will be a [[deterministic algorithm]].
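The confidence-and-erasure idea above can be sketched in a few lines of Python. This is an illustrative toy, not Forney's exact procedure: each inner block is decoded to the nearest inner codeword by Hamming distance, and the corresponding outer symbol is erased when that distance exceeds a (hypothetical) confidence threshold.

```python
def hamming(a, b):
    # Number of positions at which two equal-length strings differ.
    return sum(x != y for x, y in zip(a, b))

def decode_with_erasures(received_blocks, inner_codewords, threshold):
    """Return a list of outer symbols; None marks an erasure.

    `threshold` is the largest Hamming distance at which we still
    trust the nearest-codeword decision (an illustrative parameter,
    not the one used in Forney's algorithm).
    """
    outer_symbols = []
    for block in received_blocks:
        # Nearest inner codeword; its index plays the role of the outer symbol.
        dists = [hamming(block, cw) for cw in inner_codewords]
        best = min(range(len(inner_codewords)), key=dists.__getitem__)
        if dists[best] <= threshold:
            outer_symbols.append(best)   # high confidence: keep the symbol
        else:
            outer_symbols.append(None)   # low confidence: erase it
    return outer_symbols

# Toy inner code: the 3-bit repetition code for the symbols 0 and 1.
inner = ["000", "111"]
received = ["000", "111", "110"]
print(decode_with_erasures(received, inner, threshold=0))  # [0, 1, None]
```

With threshold 0, only blocks that match an inner codeword exactly are kept, so the noisy third block becomes an erasure that the outer errors-and-erasures decoder can then handle.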