In [[information theory]], the '''noisy-channel coding theorem''' establishes that, however contaminated with noise a communication channel may be, it is possible to communicate digital data ([[information]]) nearly error-free — that is, with arbitrarily small probability of error — up to a computable maximum rate through the channel. This surprising result, sometimes called the '''fundamental theorem of information theory''', or just '''Shannon's theorem''', was first presented by [[Claude Shannon]] in [[1948]].
The '''Shannon limit''' or '''Shannon capacity''' of a communications channel is the theoretical maximum information transfer rate of the channel, for a particular noise level.
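As a concrete illustration (not part of the general theorem, which applies to any channel model), the Shannon–Hartley formula gives this capacity for the special case of a band-limited channel with additive white Gaussian noise: ''C'' = ''B'' log<sub>2</sub>(1 + ''S''/''N''), where ''B'' is the bandwidth in hertz and ''S''/''N'' is the linear signal-to-noise power ratio. A minimal sketch, with illustrative parameter values chosen here as an example:

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley capacity (bits per second) of an AWGN channel.

    bandwidth_hz: channel bandwidth B in hertz
    snr_linear:   signal-to-noise power ratio S/N (linear scale, not dB)
    """
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Example values (hypothetical): a 3000 Hz telephone-like channel
# at 30 dB SNR, i.e. S/N = 1000 on a linear scale.
capacity = shannon_capacity(3000, 1000)  # roughly 29.9 kbit/s
```

No coding scheme can reliably exceed this rate on such a channel; conversely, the theorem guarantees that rates below it are achievable with arbitrarily small error probability.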