Noisy-channel coding theorem: Difference between revisions

Iseetho (talk | contribs)
*Also from the AEP, we know the probability that a particular <math>X_1^n(i)</math> and the <math>Y_1^n</math> resulting from W = 1 are jointly typical is <math>\le 2^{-n(I(X;Y) - 3\epsilon)}</math>. Thus,
 
Define

<math>E_i = \{(X_1^n(i), Y_1^n) \in A_\epsilon^{(n)}\},\quad i = 1, 2, \ldots, 2^{nR}</math>

as the event that the codeword for message ''i'' is jointly typical with the sequence received when message 1 is sent.
 
By the symmetry of the random code construction, the average error probability does not depend on which message was sent, so we may condition on <math>W = 1</math>; the union bound then gives

<math>P(error) = P(error|W=1) \le P(E_1^c) + \sum_{i=2}^{2^{nR}}P(E_i)</math>
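The two AEP facts above complete this step (a sketch of the standard argument, with <math>\epsilon</math> as in the definition of <math>A_\epsilon^{(n)}</math>): the first fact gives <math>P(E_1^c) \le \epsilon</math> for <math>n</math> sufficiently large, and since each of the <math>2^{nR}-1</math> other codewords is generated independently of <math>Y_1^n</math>, the second fact bounds each remaining term, so

<math>P(error) \le \epsilon + \sum_{i=2}^{2^{nR}} 2^{-n(I(X;Y) - 3\epsilon)} \le \epsilon + 2^{nR}\, 2^{-n(I(X;Y) - 3\epsilon)} = \epsilon + 2^{-n(I(X;Y) - R - 3\epsilon)},</math>

which is at most <math>2\epsilon</math> for <math>n</math> sufficiently large whenever <math>R < I(X;Y) - 3\epsilon</math>.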