Noisy-channel coding theorem: Difference between revisions

# <math>\le 1 + P_e^{(n)}nR + nC</math> by the fact that capacity is maximized mutual information.
 
The result of these steps is that <math> P_e^{(n)} \ge 1 - \frac{1}{nR} - \frac{C}{R} </math>. As the block length <math>n</math> goes to infinity, we obtain that <math> P_e^{(n)}</math> is bounded away from 0 if R is greater than C; we can get arbitrarily low probabilities of error only if R is less than C.
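The behavior of this bound can be illustrated numerically. The sketch below (with hypothetical values for the capacity C and rate R, not taken from the article) evaluates <math>1 - \tfrac{1}{nR} - \tfrac{C}{R}</math> for increasing block lengths:

```python
def converse_lower_bound(n: int, R: float, C: float) -> float:
    """Weak-converse lower bound on the block error probability,
    P_e^(n) >= 1 - 1/(n*R) - C/R, derived from Fano's inequality."""
    return 1.0 - 1.0 / (n * R) - C / R

# Hypothetical example: capacity C = 0.5 bits/use, attempted rate R = 0.75
C, R = 0.5, 0.75
for n in (10, 100, 1000, 10**6):
    print(f"n = {n:>7}: P_e >= {converse_lower_bound(n, R, C):.6f}")
```

As n grows, the <math>\tfrac{1}{nR}</math> term vanishes and the bound approaches <math>1 - \tfrac{C}{R}</math> (here 1/3), showing that the error probability stays bounded away from 0 whenever R exceeds C.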
 
=== Strong converse for discrete memoryless channels ===