[[File:Hamming.jpg|thumb|A two-dimensional visualisation of the [[Hamming distance]], a critical measure in coding theory.]]
'''Coding theory''' is the study of the properties of [[code]]s and their respective fitness for specific applications. Codes are used for [[data compression]], [[cryptography]], [[error detection and correction]], [[data transmission]] and [[data storage]]. Codes are studied by various scientific disciplines—such as [[information theory]], [[electrical engineering]], [[mathematics]], [[linguistics]], and [[computer science]]—for the purpose of designing efficient and reliable [[data transmission]] methods. This typically involves the removal of redundancy and the correction or detection of errors in the transmitted data.
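Because error detection and correction hinge on the Hamming distance between codewords, the key parameter of a block code is its minimum distance ''d'': such a code can detect up to ''d'' − 1 bit errors and correct up to ⌊(''d'' − 1)/2⌋ of them. A minimal sketch in Python, using a small made-up [5, 2] binary code purely for illustration (the codewords are not taken from any cited source):

<syntaxhighlight lang="python">
from itertools import combinations

def hamming_distance(x: str, y: str) -> int:
    """Number of positions at which two equal-length binary words differ."""
    if len(x) != len(y):
        raise ValueError("words must have equal length")
    return sum(a != b for a, b in zip(x, y))

# Hypothetical [5, 2] block code used purely for illustration.
codewords = ["00000", "01101", "10110", "11011"]

# Minimum distance d: the smallest Hamming distance between distinct codewords.
d = min(hamming_distance(u, v) for u, v in combinations(codewords, 2))

print(d)             # 3 for this toy code
print(d - 1)         # up to 2 bit errors can be detected
print((d - 1) // 2)  # up to 1 bit error can be corrected
</syntaxhighlight>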
There are four types of coding:<ref>{{cite book
|author1=James Irvine |author2=David Harle
|url=https://books.google.com/books?id=ZigejECe4r0C |title=Data Communications and Networks
|page=18
|section=2.4.4 Types of Coding
|quote=There are four types of coding|isbn=9780471808725
}}
</ref>
# [[Data compression]] (or ''source coding'')
# [[Error detection and correction|Error control]] (or ''channel coding'')
# [[Cryptography|Cryptographic coding]]
# [[Line code|Line coding]]
==History of coding theory==
In 1948, [[Claude Shannon]] published "[[A Mathematical Theory of Communication]]", an article in two parts in the July and October issues of the ''Bell System Technical Journal''. This work focuses on the problem of how best to encode the [[information]] a sender wants to transmit. In this fundamental work he used tools in probability theory, developed by [[Norbert Wiener]], which were in their nascent stages of being applied to communication theory at that time. Shannon developed [[information entropy]] as a measure for the uncertainty in a message while essentially inventing the field of [[information theory]].
The [[binary Golay code]] was developed in 1949. It is an error-correcting code capable of correcting up to three errors in each 24-bit word, and detecting a fourth.
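This capability follows from the code's minimum [[Hamming distance]] of 8: a block code of minimum distance ''d'' can simultaneously correct up to ''t'' errors and detect up to ''e'' ≥ ''t'' errors whenever ''t'' + ''e'' ≤ ''d'' − 1. A brief illustrative sketch of that arithmetic in Python (the function is hypothetical, not from any library):

<syntaxhighlight lang="python">
def correct_detect_capability(d: int) -> tuple[int, int]:
    """Split minimum distance d into (t, e): up to t errors corrected
    while up to e errors are still detected, since t + e <= d - 1."""
    t = (d - 1) // 2   # largest guaranteed correction radius
    e = d - 1 - t      # remaining detection capability
    return t, e

# Extended binary Golay code: 24-bit words, minimum distance 8.
print(correct_detect_capability(8))  # (3, 4): correct three errors, detect a fourth
</syntaxhighlight>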