Decoding methods

==Ideal observer decoding==
 
One may be given the received message <math>x \in \mathbb{F}_2^n</math>; '''ideal observer decoding''' then generates the codeword <math>y \in C</math> that maximises

:<math>\mathbb{P}(y \mbox{ sent} \mid x \mbox{ received}),</math>

that is, the codeword <math>y</math> that was most likely to have been sent, given that the message <math>x</math> was received.
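
The following is a minimal sketch of ideal observer decoding in Python, assuming a [[binary symmetric channel]] with crossover probability <math>p</math> and an explicit prior over the codewords; the toy code, prior and numbers are arbitrary choices made only for illustration.

<syntaxhighlight lang="python">
# Sketch only: ideal observer decoding over a binary symmetric channel (BSC).
# The code C, the prior on codewords and the crossover probability p are
# arbitrary example values.

def ideal_observer_decode(x, codewords, prior, p):
    """Return the codeword y maximising P(y sent | x received).

    By Bayes' rule this is proportional to P(x received | y sent) * P(y sent);
    on a BSC the likelihood depends only on the number of differing bits.
    """
    def posterior(y):
        d = sum(a != b for a, b in zip(x, y))              # Hamming distance
        likelihood = (p ** d) * ((1 - p) ** (len(x) - d))  # BSC likelihood
        return likelihood * prior[y]
    return max(codewords, key=posterior)

# Toy length-3 code with a non-uniform prior on its two codewords.
C = [(0, 0, 0), (1, 1, 1)]
prior = {(0, 0, 0): 0.7, (1, 1, 1): 0.3}
print(ideal_observer_decode((1, 0, 1), C, prior, p=0.1))   # -> (1, 1, 1)
</syntaxhighlight>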
 
=== Decoding conventions ===
The decoded codeword need not be unique: there may be more than one codeword with an equal likelihood of mutating into the received message. In such a case, the sender and receiver(s) must agree ahead of time on a decoding convention. Popular conventions include:

:# Request that the codeword be resent{{snd}}[[automatic repeat-request]].
:# Choose any codeword at random from the set of most likely codewords.
:# If [[Concatenated error correction code|another code follows]], mark the ambiguous bits of the codeword as erasures and hope that the outer code disambiguates them.
 
==Maximum likelihood decoding==

Given a received vector <math>x \in \mathbb{F}_2^n</math>, '''maximum likelihood decoding''' picks a codeword <math>y \in C</math> that maximises

:<math>\mathbb{P}(x \mbox{ received} \mid y \mbox{ sent}),</math>

that is, the codeword <math>y</math> that maximises the probability that <math>x</math> was received, given that <math>y</math> was sent. If all codewords are equally likely to be sent, this scheme is equivalent to ideal observer decoding.
 
==Minimum distance decoding==

Given a received codeword <math>x \in \mathbb{F}_2^n</math>, '''minimum distance decoding''' picks a codeword <math>y \in C</math> to minimise the [[Hamming distance]]

:<math>d(x,y) = \# \{i : x_i \not = y_i \},</math>

that is, it chooses the codeword <math>y</math> that is as close as possible to <math>x</math>.

Minimum distance decoding is also known as ''nearest neighbour decoding''. It can be assisted or automated by using a [[standard array]]. Minimum distance decoding is a reasonable decoding method when the following conditions are met:
 
:#The probability <math>p</math> that an error occurs is independent of the position of the symbol.
:#Errors are independent events{{snd}}an error at one position in the message does not affect other positions.
 
These assumptions may be reasonable for transmissions over a [[binary symmetric channel]]. They may be unreasonable for other media, such as a DVD, where a single scratch on the disk can cause an error in many neighbouring symbols or codewords.
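
As an illustration, the following Python sketch performs nearest neighbour decoding by exhaustively scanning a list of codewords; practical decoders exploit the structure of the code (for example via a [[standard array]] or a syndrome table, as below) rather than this brute-force search, and the toy repetition code is an arbitrary choice for the example.

<syntaxhighlight lang="python">
# Sketch only: brute-force minimum distance (nearest neighbour) decoding.

def hamming_distance(x, y):
    """d(x, y) = #{i : x_i != y_i}, the number of positions that differ."""
    return sum(a != b for a, b in zip(x, y))

def minimum_distance_decode(x, codewords):
    """Return a codeword y minimising the Hamming distance d(x, y)."""
    return min(codewords, key=lambda y: hamming_distance(x, y))

# Toy example: the length-3 binary repetition code.
C = [(0, 0, 0), (1, 1, 1)]
print(minimum_distance_decode((0, 1, 0), C))   # -> (0, 0, 0)
</syntaxhighlight>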

==Syndrome decoding==

'''Syndrome decoding''' is a highly efficient method of decoding a [[linear code]] over a noisy channel, using the [[parity-check matrix]] <math>H</math> of the code. Suppose that the codeword <math>x \in C</math> was sent and that the error pattern <math>e \in \mathbb{F}_2^n</math> occurred, so that <math>z = x + e</math> is received. Since <math>Hx = 0</math> for every codeword, the syndrome of <math>z</math> depends only on the error:

:<math>Hz = H(x + e) = Hx + He = 0 + He = He</math>
 
To perform [[#Maximum_likelihood_decoding|ML decoding]] over a [[binary symmetric channel]], one has to look up a precomputed table of size <math>2^{n-k}</math>, mapping <math>He</math> to <math>e</math>.
 
Note that this is already significantly less complex than [[standard array|standard array decoding]].
 
However, under the assumption that no more than <math>t</math> errors were made during transmission, the receiver can look up the value <math>He</math> in a further reduced table of size

:<math>\sum_{i=0}^{t} \binom{n}{i}</math>

(for a binary code), listing only the error patterns of weight at most <math>t</math>. The error pattern <math>e</math> obtained from this look-up gives the decoded codeword as

:<math>x = z - e.</math>
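
A minimal Python sketch of this table-based procedure is given below, using the binary [7,4] [[Hamming code]] (for which <math>t = 1</math>) purely as a small worked example; the table maps each syndrome <math>He</math> to its error pattern <math>e</math>, and over <math>\mathbb{F}_2</math> the correction <math>z - e</math> is the same as <math>z + e</math>.

<syntaxhighlight lang="python">
import numpy as np

# Sketch only: syndrome decoding of the binary [7,4] Hamming code (t = 1).
# H is its parity-check matrix; all arithmetic is over GF(2), i.e. modulo 2.
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])
n = H.shape[1]

def syndrome(v):
    """Compute Hv over GF(2), returned as a hashable tuple."""
    return tuple(H.dot(v) % 2)

# Reduced table mapping He -> e for every error pattern e of weight <= t = 1
# (the all-zero pattern plus the n single-bit patterns).
syndrome_table = {syndrome(np.zeros(n, dtype=int)): np.zeros(n, dtype=int)}
for i in range(n):
    e = np.zeros(n, dtype=int)
    e[i] = 1
    syndrome_table[syndrome(e)] = e

def syndrome_decode(z):
    """Recover the codeword x = z - e (equivalently z + e over GF(2))."""
    e = syndrome_table[syndrome(z)]
    return (z + e) % 2

x = np.array([1, 1, 1, 0, 0, 0, 0])   # a codeword: Hx = 0
z = x.copy()
z[4] ^= 1                             # a single transmission error
print(syndrome_decode(z))             # -> [1 1 1 0 0 0 0]
</syntaxhighlight>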
 
== Partial response maximum likelihood ==
{{Main|PRML}}
 
Partial response maximum likelihood ([[PRML]]) is a method for converting the weak analog signal from the head of a magnetic disk or tape drive into a digital signal.
 
== Viterbi decoder ==
{{Main|Viterbi decoder}}
 
A Viterbi decoder uses the Viterbi algorithm for decoding a bitstream that has been encoded using [[forward error correction]] based on a convolutional code.
The [[Hamming distance]] is used as a metric for hard decision Viterbi decoders. The ''squared'' [[Euclidean distance]] is used as a metric for soft decision decoders.
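
The sketch below illustrates hard decision Viterbi decoding with the Hamming metric for a rate-1/2, constraint length 3 convolutional code with generator polynomials 7 and 5 (octal); the particular code, the zero-tail termination and the single injected error are arbitrary choices made for the example.

<syntaxhighlight lang="python">
# Sketch only: hard-decision Viterbi decoding for a rate-1/2, constraint
# length 3 convolutional code with generators 7 and 5 (octal).

K = 3                                    # constraint length
N_STATES = 1 << (K - 1)                  # 4 encoder states (last two input bits)

def _branch(state, bit):
    """Next state and the two output bits for input `bit` in `state`."""
    s1, s0 = (state >> 1) & 1, state & 1
    out = (bit ^ s1 ^ s0, bit ^ s0)      # generators 111 and 101 (binary)
    return (bit << 1) | (state >> 1), out

def conv_encode(bits):
    """Encode a bit list, appending K-1 zero tail bits to return to state 0."""
    state, out = 0, []
    for b in list(bits) + [0] * (K - 1):
        state, o = _branch(state, b)
        out.extend(o)
    return out

def viterbi_decode(rx):
    """Viterbi decoding of hard received bits rx using the Hamming metric."""
    INF = float("inf")
    metric = [0.0] + [INF] * (N_STATES - 1)      # encoder starts in state 0
    paths = [[] for _ in range(N_STATES)]
    for t in range(0, len(rx), 2):
        new_metric = [INF] * N_STATES
        new_paths = [None] * N_STATES
        for s in range(N_STATES):
            if metric[s] == INF:
                continue
            for b in (0, 1):
                ns, (o0, o1) = _branch(s, b)
                m = metric[s] + (o0 != rx[t]) + (o1 != rx[t + 1])
                if m < new_metric[ns]:           # keep the survivor path
                    new_metric[ns], new_paths[ns] = m, paths[s] + [b]
        metric, paths = new_metric, new_paths
    return paths[0][:-(K - 1)]                   # trellis ends in state 0; drop tail

msg = [1, 0, 1, 1, 0]
rx = conv_encode(msg)
rx[3] ^= 1                                       # inject a single channel error
print(viterbi_decode(rx))                        # -> [1, 0, 1, 1, 0]
</syntaxhighlight>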
 
== See also ==
* [[Error detection and correction]]