which is within one bit of the entropy.
===Expected word length===
For Shannon's method, the word lengths satisfy
:<math>l_i = \lceil -\log_2 p_i \rceil \leq -\log_2 p_i + 1 .</math>
Hence the expected word length satisfies
:<math>\mathbb E L = \sum_{i=1}^n p_il_i \leq \sum_{i=1}^n p_i (-\log_2 p_i + 1) = -\sum_{i=1}^n p_i \log_2 p_i + \sum_{i=1}^n p_i = H(X) + 1.</math>
Here, <math>H(X) = - \sum_{i=1}^n p_i \log_2 p_i</math> is the [[Entropy (information theory)|entropy]], and [[Shannon's source coding theorem]] says that any code must have an average length of at least <math>H(X)</math>. Hence we see that the Shannon–Fano code is always within one bit of the optimal expected word length.
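The bound above can be checked numerically: compute the Shannon word lengths <math>l_i = \lceil -\log_2 p_i \rceil</math> for a distribution and verify that the expected length lies between the entropy and the entropy plus one. A minimal sketch (the distribution and function names are illustrative, not from the article):

```python
import math

def shannon_code_lengths(probs):
    """Word lengths l_i = ceil(-log2 p_i) prescribed by Shannon's method."""
    return [math.ceil(-math.log2(p)) for p in probs]

def entropy(probs):
    """Binary entropy H(X) = -sum p_i log2 p_i."""
    return -sum(p * math.log2(p) for p in probs)

# Hypothetical example distribution
probs = [0.36, 0.18, 0.18, 0.15, 0.13]
lengths = shannon_code_lengths(probs)
H = entropy(probs)
EL = sum(p * l for p, l in zip(probs, lengths))

# The lengths satisfy the Kraft inequality, so a prefix code exists,
# and the expected length is within one bit of the entropy:
assert sum(2.0 ** -l for l in lengths) <= 1.0
assert H <= EL < H + 1
```

Here the expected length works out to 2.64 bits against an entropy of about 2.21 bits, consistent with <math>H(X) \leq \mathbb E L < H(X) + 1</math>.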
==Fano's code: binary splitting==