The [[information entropy|entropy]] ''H'' (in bits) is the weighted sum, across all symbols {{math|''a''<sub>''i''</sub>}} with non-zero probability {{math|''w''<sub>''i''</sub>}}, of the information content of each symbol:
<math display="block"> H(A) = \sum_{w_i > 0} w_i h(a_i) = \sum_{w_i > 0} w_i \log_2{1 \over w_i} = - \sum_{w_i > 0} w_i \log_2{w_i}. </math>
(Note: A symbol with zero probability has zero contribution to the entropy, since <math>\lim_{w \to 0^+} w \log_2 w = 0</math>. So for simplicity, symbols with zero probability can be left out of the formula above.)
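The entropy formula above can be sketched directly in a few lines of Python; the function name and example probabilities here are illustrative, not part of the article:

```python
from math import log2

def entropy(weights):
    """Shannon entropy H(A) in bits.

    Symbols with zero probability are skipped, matching the convention
    in the formula above (their limiting contribution is zero).
    """
    return -sum(w * log2(w) for w in weights if w > 0)

# Example: probabilities 0.5, 0.25, 0.25 (a zero-probability symbol is ignored)
print(entropy([0.5, 0.25, 0.25, 0.0]))  # → 1.5
```

Here each term {{math|''w''<sub>''i''</sub> log<sub>2</sub>(1/''w''<sub>''i''</sub>)}} is accumulated directly, and the `if w > 0` guard implements the note about zero-probability symbols.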