Error detection and correction

* ''Block codes'' are processed on a [[Block (telecommunications)|block-by-block]] basis. Early examples of block codes are [[repetition code]]s, [[Hamming code]]s and [[multidimensional parity-check code]]s. They were followed by a number of efficient codes, [[Reed–Solomon code]]s being the most notable due to their current widespread use. [[Turbo code]]s and [[low-density parity-check code]]s (LDPC) are relatively new constructions that can provide almost [[:Category:Capacity-approaching codes|optimal efficiency]].
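As a minimal illustration (not part of the article's references), the simplest block code above, a rate-1/3 repetition code, can be sketched in a few lines of Python: each source bit is transmitted three times, and the decoder takes a majority vote over each block, which corrects any single bit flip per block.

```python
def encode(bits, n=3):
    # Repetition code: repeat each source bit n times (code rate 1/n).
    return [b for b in bits for _ in range(n)]

def decode(codeword, n=3):
    # Majority vote over each block of n copies; corrects up to
    # (n - 1) // 2 bit flips per block.
    return [1 if sum(codeword[i:i + n]) > n // 2 else 0
            for i in range(0, len(codeword), n)]

msg = [1, 0, 1, 1]
tx = encode(msg)
tx[1] ^= 1                    # flip one bit inside the first block
assert decode(tx) == msg      # the single error is corrected
```

Repetition codes are easy to decode but inefficient; the Hamming and Reed–Solomon codes mentioned above achieve the same single-error correction at much higher code rates.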
 
[[Noisy-channel coding theorem|Shannon's noisy-channel coding theorem]] is an important theorem in forward error correction, and describes the maximum [[information rate]] at which reliable communication is possible over a channel that has a certain error probability or [[signal-to-noise ratio]] (SNR). This strict upper limit is expressed in terms of the [[channel capacity]]. More specifically, the theorem says that there exist codes such that with increasing encoding length the probability of error on a [[channel model|discrete memoryless channel]] can be made arbitrarily small, provided that the [[code rate]] is smaller than the channel capacity. The code rate is defined as the fraction ''k''/''n'', where ''k'' is the number of source symbols and ''n'' is the number of encoded symbols.
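The capacity bound can be made concrete for the binary symmetric channel, whose capacity is the standard closed form ''C'' = 1 − ''H''₂(''p''), with ''H''₂ the binary entropy function and ''p'' the crossover (bit-flip) probability. The sketch below (illustrative values, not from the article) checks whether a given code rate ''k''/''n'' lies below capacity, i.e. whether reliable communication is possible in principle at that rate.

```python
from math import log2

def h2(p):
    # Binary entropy function H2(p) in bits per symbol.
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    # Capacity of a binary symmetric channel with crossover probability p.
    return 1.0 - h2(p)

# Example: at p = 0.05 the capacity is about 0.714 bits/symbol,
# so a rate k/n = 1/2 code is below capacity and Shannon's theorem
# guarantees codes exist that make the error probability arbitrarily small.
k, n, p = 1, 2, 0.05
assert k / n < bsc_capacity(p)
```

Note that the theorem only guarantees existence; it does not say which code achieves this, which is the point the following paragraph makes.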
 
The actual maximum code rate allowed depends on the error-correcting code used, and may be lower. This is because Shannon's proof was only existential in nature, and did not show how to construct codes that are both optimal and have [[polynomial time|efficient]] encoding and decoding algorithms.