::<math>H(G) = \min_{X,Y} I(X ; Y)</math>
where <math>X</math> is chosen [[Discrete uniform distribution|uniformly]] from <math>V</math>, <math>Y</math> ranges over [[Independent set (graph theory)|independent sets]] of <math>G</math>, the joint distribution of <math>X</math> and <math>Y</math> is such that <math>X\in Y</math> with probability one, and <math>I(X ; Y)</math> is the [[mutual information]] of <math>X</math> and <math>Y</math>.<ref>G. Simonyi, "Perfect graphs and graph entropy. An updated survey", ''Perfect Graphs'', John Wiley and Sons (2001), pp. 293–328, Definition 2.</ref>
That is, letting <math>\mathcal{I}</math> denote the family of independent vertex sets of <math>G</math>, we seek the joint distribution of <math>X</math> and <math>Y</math> on <math>V \times \mathcal{I}</math> with the lowest mutual information such that (i) the marginal distribution of the first coordinate is uniform over the vertices and (ii) in samples from the distribution, the second coordinate almost surely contains the first. The mutual information of <math>X</math> and <math>Y</math> is then called the entropy of <math>G</math>.
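Two extreme cases illustrate the definition. If <math>G = K_n</math> is the [[complete graph]] on <math>n</math> vertices, every independent set is a single vertex, so the constraint <math>X \in Y</math> forces <math>Y = \{X\}</math> and

::<math>H(K_n) = I(X ; \{X\}) = H(X) = \log n.</math>

At the other extreme, if <math>G</math> has no edges, then <math>V</math> itself is an independent set, so one may take <math>Y = V</math> with probability one; <math>X</math> and <math>Y</math> are then independent and <math>H(G) = 0</math>.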
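For small graphs, the minimization can also be carried out numerically. The sketch below is an illustration written for this article rather than a method from the cited survey; the function names and the choice of SciPy's SLSQP solver are arbitrary. It enumerates the independent sets and minimizes <math>I(X;Y)</math> over the conditional distribution of <math>Y</math> given <math>X</math>:

<syntaxhighlight lang="python">
import itertools
import math

import numpy as np
from scipy.optimize import minimize


def independent_sets(n, edges):
    """Enumerate all non-empty independent vertex sets of the graph."""
    sets = []
    for r in range(1, n + 1):
        for s in itertools.combinations(range(n), r):
            if all((u, v) not in edges and (v, u) not in edges
                   for u, v in itertools.combinations(s, 2)):
                sets.append(frozenset(s))
    return sets


def graph_entropy(n, edges):
    """Minimize I(X;Y) over joint distributions on V x I with
    X uniform on V and X in Y almost surely (brute force)."""
    ind_sets = independent_sets(n, edges)
    m = len(ind_sets)
    # Free variables: q[x, j] = P(Y = ind_sets[j] | X = x),
    # restricted to the independent sets that contain x.
    support = [(x, j) for x in range(n) for j in range(m) if x in ind_sets[j]]
    k = len(support)

    def mutual_information(q):
        # Joint distribution p(x, y) = (1/n) * q(y | x).
        p_joint = np.zeros((n, m))
        for (x, j), v in zip(support, q):
            p_joint[x, j] = v / n
        p_y = p_joint.sum(axis=0)
        mi = 0.0
        for x in range(n):
            for j in range(m):
                if p_joint[x, j] > 1e-12 and p_y[j] > 1e-12:
                    mi += p_joint[x, j] * math.log(
                        p_joint[x, j] / ((1.0 / n) * p_y[j]))
        return mi

    # Each conditional distribution q(. | x) must sum to one.
    constraints = []
    for x in range(n):
        idx = [i for i, (xx, _) in enumerate(support) if xx == x]
        constraints.append({'type': 'eq',
                            'fun': lambda q, idx=idx: sum(q[i] for i in idx) - 1.0})

    # Start from the uniform conditional distribution on each row.
    q0 = np.zeros(k)
    for x in range(n):
        idx = [i for i, (xx, _) in enumerate(support) if xx == x]
        q0[idx] = 1.0 / len(idx)

    result = minimize(mutual_information, q0, bounds=[(0.0, 1.0)] * k,
                      constraints=constraints, method='SLSQP')
    return result.fun


# Complete graph K_3: the entropy should equal log 3, about 1.0986.
print(graph_entropy(3, {(0, 1), (0, 2), (1, 2)}))
</syntaxhighlight>

For <math>K_3</math> the only feasible coupling pairs each vertex with its own singleton set, so the program returns <math>\log 3 \approx 1.0986</math>, matching the closed-form value above.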
==Properties==