:<math> H(A) = \sum_{w_i > 0} w_i h(a_i) = \sum_{w_i > 0} w_i \log_2{1 \over w_i} = - \sum_{w_i > 0} w_i \log_2{w_i}. </math>
(Note: A symbol with zero probability has zero contribution to the entropy, since <math>\lim_{w \to 0^+} w \log_2 w = 0</math>. So for simplicity, symbols with zero probability can be left out of the formula above.)
As a consequence of [[Shannon's source coding theorem]], the entropy is a measure of the smallest codeword length that is theoretically possible for the given alphabet with associated weights. In this example, the weighted average codeword length is 2.25 bits per symbol, only slightly larger than the calculated entropy of 2.205 bits per symbol. So not only is this code optimal in the sense that no other feasible code performs better, but it is very close to the theoretical limit established by Shannon.
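The comparison above can be reproduced with a short computation. The sketch below assumes the example weights {0.10, 0.15, 0.30, 0.16, 0.29} and codeword lengths {3, 3, 2, 2, 2} from the article's worked Huffman example; any optimal prefix code for these weights yields the same multiset of lengths.

```python
import math

# Symbol weights and codeword lengths (assumed from the article's worked
# example; e.g. a=010, b=011, c=11, d=00, e=10 gives these lengths).
weights = {"a": 0.10, "b": 0.15, "c": 0.30, "d": 0.16, "e": 0.29}
lengths = {"a": 3, "b": 3, "c": 2, "d": 2, "e": 2}

# Shannon entropy H(A) = -sum(w * log2 w), skipping zero-probability symbols.
entropy = -sum(w * math.log2(w) for w in weights.values() if w > 0)

# Weighted average codeword length of the Huffman code.
avg_length = sum(weights[s] * lengths[s] for s in weights)

print(f"entropy    = {entropy:.3f} bits/symbol")     # ≈ 2.205
print(f"avg length = {avg_length:.2f} bits/symbol")  # = 2.25
```

Running this shows the average codeword length (2.25) sitting just above the entropy (about 2.205), illustrating the bound from the source coding theorem.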