Convolutional code

In [[telecommunication]], a '''convolutional code''' is a type of [[error-correcting code]] in which (a) each ''m''-[[bit]] [[information]] symbol (each ''m''-[[bit string]]) to be encoded is transformed into an ''n''-bit symbol, where ''m''/''n'' is the code ''rate'' (''n'' ≥ ''m''), and (b) the transformation is a function of the last ''k'' information symbols, where ''k'' is the constraint length of the code.
 
===Where Convolutional Codes are used===
 
===Convolutional Encoding===
To convolutionally encode data, start with ''k'' memory registers, each holding one input bit. Unless otherwise specified, all memory registers start with a value of 0. The encoder has ''n'' modulo-2 adders and ''n'' [[generator polynomial]]s, one for each adder (see figure below). An input bit ''m''<sub>1</sub> is fed into the leftmost register. Using the generator polynomials and the existing values in the remaining registers, the encoder outputs ''n'' bits. Now bit-shift all register values to the right (''m''<sub>1</sub> moves to ''m''<sub>0</sub>, ''m''<sub>0</sub> moves to ''m''<sub>-1</sub>) and wait for the next input bit. If there are no remaining input bits, the encoder continues to output bits until all registers have returned to the zero state.
 
The figure below is a rate 1/3 (''m''/''n'') encoder with constraint length (''k'') of 3. The generator polynomials are G<sub>1</sub> = (1,1,1), G<sub>2</sub> = (0,1,1), and G<sub>3</sub> = (1,0,1). Therefore, the output bits are calculated as follows:
 
''n''<sub>1</sub> = mod2 (''m''<sub>1</sub> + ''m''<sub>0</sub> + ''m''<sub>-1</sub>)<br>
''n''<sub>2</sub> = mod2 (''m''<sub>0</sub> + ''m''<sub>-1</sub>)<br>
''n''<sub>3</sub> = mod2 (''m''<sub>1</sub> + ''m''<sub>-1</sub>)<br>
 
[[Image:Convolutional_encoder.png|frame|none|Img.1. Rate 1/3 non-recursive, non-systematic convolutional encoder with constraint length 3]]
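The encoding procedure above can be sketched in Python. This is an illustrative implementation of the encoder in the figure (the function name and list-based register model are not from the article; the registers are flushed with ''k''−1 zeros so they return to the zero state):

```python
# Rate 1/3 convolutional encoder with constraint length 3 and generator
# polynomials G1 = (1,1,1), G2 = (0,1,1), G3 = (1,0,1), as in the figure.
GENERATORS = [(1, 1, 1), (0, 1, 1), (1, 0, 1)]

def conv_encode(bits, generators=GENERATORS):
    k = len(generators[0])          # constraint length
    regs = [0] * k                  # [m1, m0, m-1], all starting at 0
    out = []
    # Feed the message, then k-1 zeros to drive the registers back to zero.
    for bit in list(bits) + [0] * (k - 1):
        regs = [bit] + regs[:-1]    # shift right; new bit enters the leftmost register
        for g in generators:
            # Each adder sums the register values selected by its polynomial, mod 2.
            out.append(sum(r & tap for r, tap in zip(regs, g)) % 2)
    return out
```

For example, encoding the single bit 1 produces `[1, 0, 1]` for the message step, followed by the outputs of the two flush steps.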
 
===Decoding Convolutional Codes===
Several [[algorithm]]s exist for decoding convolutional codes. For relatively small values of ''k'', the [[Viterbi algorithm]] is universally used, as it provides [[maximum likelihood]] performance and is highly parallelizable. Viterbi decoders are thus easy to implement in [[VLSI]] hardware and in software on CPUs with [[SIMD]] instruction sets.
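A minimal hard-decision Viterbi decoder for the rate-1/3, ''k'' = 3 code above might look like the following sketch. It assumes a zero-terminated trellis (the encoder flushed with ''k''−1 zeros); the function names are illustrative, and production decoders typically use soft decisions and windowed traceback rather than full-path survivors:

```python
from itertools import product

# Generator polynomials from the figure: G1 = (1,1,1), G2 = (0,1,1), G3 = (1,0,1).
GENERATORS = [(1, 1, 1), (0, 1, 1), (1, 0, 1)]

def _branch(state, bit):
    """Next state and output bits when `bit` enters the encoder in `state`.

    The state holds the two older register values (m0, m-1); the input bit
    occupies the leftmost register when the outputs are computed.
    """
    regs = (bit,) + state
    out = tuple(sum(r & tap for r, tap in zip(regs, g)) % 2 for g in GENERATORS)
    return regs[:-1], out

def viterbi_decode(received, n_msg_bits, k=3):
    n = len(GENERATORS)                        # coded bits per input bit
    states = list(product((0, 1), repeat=k - 1))
    zero = (0,) * (k - 1)
    metric = {s: 0 if s == zero else float("inf") for s in states}
    paths = {s: [] for s in states}
    for t in range(len(received) // n):
        obs = received[t * n:(t + 1) * n]
        new_metric = {s: float("inf") for s in states}
        new_paths = {s: [] for s in states}
        # After the message bits, only the zero flush bits are allowed.
        choices = (0, 1) if t < n_msg_bits else (0,)
        for s in states:
            if metric[s] == float("inf"):
                continue
            for bit in choices:
                nxt, out = _branch(s, bit)
                # Hamming distance between received and hypothesized outputs.
                d = metric[s] + sum(a != b for a, b in zip(obs, out))
                if d < new_metric[nxt]:            # keep the survivor path
                    new_metric[nxt] = d
                    new_paths[nxt] = paths[s] + [bit]
        metric, paths = new_metric, new_paths
    # A terminated trellis ends in the all-zero state; drop the flush bits.
    return paths[zero][:n_msg_bits]
```

Because the minimum-weight nonzero codeword of this code has weight well above 2, the decoder recovers the message even when a received bit has been flipped.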
 
===Popular Convolutional Codes===
An especially popular Viterbi-decoded convolutional code, used at least since the [[Voyager program]], has a constraint length ''k'' of 7 and a rate ''r'' of 1/2.
 
* Longer constraint lengths produce more powerful codes, but the [[complexity]] of the Viterbi algorithm [[exponential growth|increases exponentially]] with constraint length, limiting these more powerful codes to deep-space missions, where the extra performance is easily worth the increased decoder complexity.
 
* [[Mars Pathfinder]], [[Mars Exploration Rover]] and the [[Cassini probe]] to Saturn use a ''k'' of 15 and a rate of 1/6; this code performs about 2 dB better than the simpler ''k'' = 7 code, at a cost of 256× in decoding complexity compared to the Voyager mission codes (the number of trellis states grows as 2<sup>''k''-1</sup>, and 2<sup>14</sup>/2<sup>6</sup> = 256).
 
===Punctured Convolutional Codes===
 
===Design Issues===
Longer constraint-length codes are more practically decoded with any of several '''sequential''' decoding algorithms, of which the [[Sequential decoding|Fano algorithm]] is the best known. Unlike Viterbi decoding, sequential decoding is not maximum likelihood, but its complexity increases only slightly with constraint length, allowing the use of strong, long-constraint-length codes. Such codes were used in the [[Pioneer program]] of the early 1970s to Jupiter and Saturn, but gave way to shorter, Viterbi-decoded codes, usually concatenated with large [[Reed-Solomon error correction]] codes that steepen the overall bit-error-rate curve and produce extremely low residual undetected error rates.
 
===Turbo Codes: replacing Convolutional Codes===