Noisy-channel coding theorem

#<math>\le 1 + P_e^{(n)}nR + nC</math> by the fact that the channel capacity is the maximized mutual information, so <math>I(X^n;Y^n) \le nC</math> for a memoryless channel (as sketched below).
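
The single-letterization step <math>I(X^n;Y^n) \le nC</math> is the standard bound for a discrete memoryless channel used without feedback; a brief sketch, using the chain rule for entropy, subadditivity, and the definition <math>C = \max_{p_X} I(X;Y)</math>, is

:<math>I(X^n;Y^n) = H(Y^n) - H(Y^n \mid X^n) = H(Y^n) - \sum_{i=1}^n H(Y_i \mid X_i) \le \sum_{i=1}^n \bigl( H(Y_i) - H(Y_i \mid X_i) \bigr) = \sum_{i=1}^n I(X_i;Y_i) \le nC.</math>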
 
The result of these steps is that <math> P_e^{(n)} \ge 1 - \frac{1}{nR} - \frac{C}{R} </math>. As the block length <math>n</math> goes to infinity, <math>P_e^{(n)}</math> is bounded away from 0 if <math>R</math> is greater than <math>C</math>; we can achieve arbitrarily low probabilities of error only if <math>R</math> is less than <math>C</math>.
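
The bound on <math>P_e^{(n)}</math> follows by rearranging the inequality <math>nR \le 1 + P_e^{(n)}nR + nC</math> obtained at the end of the chain above; the routine algebra is written out here for convenience:

:<math>P_e^{(n)} nR \ge nR - 1 - nC \quad\Longrightarrow\quad P_e^{(n)} \ge 1 - \frac{1}{nR} - \frac{C}{R}.</math>

Letting <math>n \to \infty</math> with <math>R > C</math> fixed, the right-hand side tends to <math>1 - \tfrac{C}{R} > 0</math>, which is the sense in which <math>P_e^{(n)}</math> is bounded away from 0.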
 
=== Strong converse for discrete memoryless channels ===