Shannon–Fano coding: Difference between revisions

Hence the expected word length satisfies
:<math>\mathbb E L = \sum_{i=1}^n p_il_i \leq \sum_{i=1}^n p_i (-\log_2 p_i + 1) = -\sum_{i=1}^n p_i \log_2 p_i + \sum_{i=1}^n p_i = H(X) + 1.</math>
Here, <math>H(X) = - \textstyle\sum_{i=1}^n p_i \log_2 p_i</math> is the [[Entropy (information theory)|entropy]], and [[Shannon's source coding theorem]] says that any code must have an average length of at least <math>H(X)</math>. Hence we see that the Shannon–Fano code is always within one bit of the optimal expected word length.
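The bound above can be checked numerically: choosing word lengths <math>l_i = \lceil -\log_2 p_i \rceil</math> makes the expected word length fall between <math>H(X)</math> and <math>H(X) + 1</math>. A minimal sketch (the distribution and helper name are illustrative, not from the article):

```python
import math

def shannon_lengths(probs):
    """Shannon code word lengths l_i = ceil(-log2 p_i)."""
    return [math.ceil(-math.log2(p)) for p in probs]

# Example distribution (illustrative; must sum to 1)
probs = [0.36, 0.18, 0.18, 0.12, 0.09, 0.07]

lengths = shannon_lengths(probs)
entropy = -sum(p * math.log2(p) for p in probs)          # H(X)
expected_length = sum(p * l for p, l in zip(probs, lengths))  # E L

# The derivation above guarantees H(X) <= E L < H(X) + 1.
assert entropy <= expected_length < entropy + 1
```

For this distribution the expected word length is 2.92 bits against an entropy of about 2.37 bits, within the promised one-bit gap.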
 
==Fano's code: binary splitting==