Convolutional codes are often used to improve the performance of [[radio]] and [[satellite]] links.
Several [[algorithm]]s exist for decoding convolutional codes. For relatively small values of <i>k</i>, the [[Viterbi algorithm]] is universally used, as it provides [[maximum likelihood]] performance and is highly parallelizable. Viterbi decoders are thus easy to implement in [[VLSI]] hardware and in software on CPUs with [[SIMD]] instruction sets. An especially popular Viterbi-decoded convolutional code, used at least since the [[Voyager program]], has a constraint length <i>k</i> of 7 and a rate <i>r</i> of 1/2. Longer constraint lengths produce more powerful codes, but the [[complexity]] of the Viterbi algorithm [[exponential growth|increases exponentially]] with the constraint length, limiting these more powerful codes to deep-space missions where the extra performance is easily worth the increased decoder complexity. [[Mars Pathfinder]], the [[Mars Exploration Rover]]s and the [[Cassini probe]] to Saturn use a <i>k</i> of 15 and a rate of 1/6; this code performs about 2 dB better than the simpler <i>k</i> = 7 code, at a cost of a 256× increase in decoding complexity.
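The trellis search behind Viterbi decoding can be sketched in a few lines of Python. This is an illustrative toy, not flight code: for brevity it assumes a small rate-1/2 code with constraint length <i>k</i> = 3 (generator polynomials 7 and 5 in octal, a common textbook choice) rather than the <i>k</i> = 7 Voyager code, and it uses hard-decision decoding.

```python
# Toy rate-1/2 convolutional code, constraint length K = 3,
# with hard-decision Viterbi decoding. Illustrative sketch only;
# generators 7 and 5 (octal) are an assumption, not the Voyager code.

G = [0b111, 0b101]      # generator polynomials (octal 7 and 5)
K = 3                   # constraint length
N_STATES = 1 << (K - 1) # one trellis state per shift-register content

def parity(x):
    """Parity (mod-2 sum) of the set bits of x."""
    return bin(x).count("1") & 1

def encode(bits):
    """Encode a list of bits; emits two output bits per input bit."""
    state, out = 0, []
    for b in bits:
        reg = (b << (K - 1)) | state          # new bit enters the register
        out.extend(parity(reg & g) for g in G)
        state = reg >> 1                      # keep the K-1 most recent bits
    return out

def viterbi_decode(received):
    """Hard-decision Viterbi: keep the minimum-Hamming-distance
    survivor path into each of the 2^(K-1) states at every step."""
    INF = float("inf")
    metric = [0] + [INF] * (N_STATES - 1)     # start in the all-zero state
    paths = [[] for _ in range(N_STATES)]
    for i in range(0, len(received), 2):
        r = received[i:i + 2]
        new_metric = [INF] * N_STATES
        new_paths = [None] * N_STATES
        for s in range(N_STATES):
            if metric[s] == INF:
                continue
            for b in (0, 1):                  # try both branch inputs
                reg = (b << (K - 1)) | s
                expected = [parity(reg & g) for g in G]
                dist = sum(e != x for e, x in zip(expected, r))
                ns = reg >> 1
                m = metric[s] + dist
                if m < new_metric[ns]:        # keep the better survivor
                    new_metric[ns] = m
                    new_paths[ns] = paths[s] + [b]
        metric, paths = new_metric, new_paths
    best = min(range(N_STATES), key=lambda s: metric[s])
    return paths[best]
```

Because the decoder tracks one survivor path per state and there are 2^(<i>k</i>−1) states, moving from <i>k</i> = 7 to <i>k</i> = 15 multiplies the state count by 2^8 = 256, which is the complexity factor quoted above.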
Longer-constraint-length codes are more practically decoded with any of several <b>sequential</b> decoding algorithms, of which the Fano algorithm is the best known. Unlike Viterbi decoding, sequential decoding is not maximum likelihood, but its complexity increases only slightly with constraint length, allowing the use of strong, long-constraint-length codes. Such codes were used in the [[Pioneer program]] of the early 1970s to Jupiter and Saturn, but they gave way to shorter, Viterbi-decoded codes, usually concatenated with large [[Reed-Solomon error correction]] codes, which steepen the overall bit-error-rate curve and produce extremely low residual undetected error rates.