Noisy-channel coding theorem: Difference between revisions

Corrected the inequality in the converse of the noisy-channel coding theorem
:and <math>H_2(p_b)</math> is the ''[[binary entropy function]]''
 
::<math>H_2(p_b)=p_b \log \left(\frac{1}{p_b}\right) + (1-p_b) \log \left(\frac{1}{1-p_b}\right)</math>
 
:3. For any ''p<sub>b</sub>'', rates greater than ''R(p<sub>b</sub>)'' are not achievable.
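The binary entropy function above can be sketched as a short Python helper (the name `binary_entropy` is illustrative, not from the article); it uses base-2 logarithms so the result is in bits, and adopts the standard convention <math>H_2(0)=H_2(1)=0</math>:

```python
import math

def binary_entropy(p):
    """H2(p) = p*log2(1/p) + (1-p)*log2(1/(1-p)), in bits.

    By convention H2(0) = H2(1) = 0, since p*log2(1/p) -> 0 as p -> 0.
    """
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return p * math.log2(1.0 / p) + (1.0 - p) * math.log2(1.0 / (1.0 - p))
```

For example, `binary_entropy(0.5)` returns `1.0`: a fair coin carries exactly one bit of entropy, which is the maximum of <math>H_2</math>.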