In [[information theory]], the '''graph entropy''' is a measure of the information rate achievable by communicating symbols over a channel in which certain pairs of values may be confused.<ref name="DehmerMowshowitz2013">{{cite book|author1=Matthias Dehmer|author2=Abbe Mowshowitz|author3=Frank Emmert-Streib|title=Advances in Network Complexity|url=https://books.google.com/books?id=fHxARaCPTKwC&pg=PT186|date=21 June 2013|publisher=John Wiley & Sons|isbn=978-3-527-67048-2|pages=186–}}</ref> This measure, first introduced by Körner in the 1970s,<ref>{{cite journal|last=Körner|first=János|title=Coding of an information source having ambiguous alphabet and the entropy of graphs|journal=6th Prague Conference on Information Theory|year=1973|pages=411–425}}</ref> has since also proven itself useful in other settings, including combinatorics.
==Definition==
Let <math>G = (V, E)</math> be an undirected graph. The graph entropy of <math>G</math>, denoted <math>H(G)</math>, is defined as
::<math>H(G) = \min_{X,Y} I(X ; Y)</math>
where <math>X</math> is chosen [[Discrete uniform distribution|uniformly]] from <math>V</math>, <math>Y</math> ranges over [[Independent set (graph theory)|independent sets]] of <math>G</math>, the joint distribution of <math>X</math> and <math>Y</math> is such that <math>X\in Y</math> with probability one, and <math>I(X ; Y)</math> is the [[mutual information]] of <math>X</math> and <math>Y</math>.<ref>G. Simonyi, "Perfect graphs and graph entropy. An updated survey," Perfect Graphs, John Wiley and Sons (2001) pp. 293–328, Definition 2</ref>
That is, if we let <math>\mathcal{I}</math> denote the independent vertex sets in <math>G</math>, we wish to find the joint distribution <math>X,Y</math> on <math>V \times \mathcal{I}</math> with the lowest mutual information such that (i) the marginal distribution of the first term is uniform and (ii) in samples from the distribution, the second term contains the first term almost surely. The mutual information of <math>X</math> and <math>Y</math> is then called the entropy of <math>G</math>.
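For instance, if <math>G</math> is the complete graph on <math>n</math> vertices, the only independent sets are the single vertices, so <math>Y</math> must equal <math>\{X\}</math>; since <math>Y</math> then determines <math>X</math>,
::<math>H(G) = I(X ; \{X\}) = H(X) = \log_2 n.</math>
At the other extreme, if <math>G</math> has no edges, then taking <math>Y = V</math> with probability one is admissible and gives <math>I(X ; Y) = 0</math>, so <math>H(G) = 0</math>. Both cases reappear among the properties below.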
==Properties==
* Monotonicity. If <math>G_1</math> is a subgraph of <math>G_2</math> on the same vertex set, then <math>H(G_1) \leq H(G_2)</math>.
* Subadditivity. Given two graphs <math>G_1 = (V, E_1)</math> and <math>G_2 = (V, E_2)</math> on the same set of vertices, the [[Graph operations|union]] <math>G_1 \cup G_2 = (V, E_1 \cup E_2)</math> satisfies <math>H(G_1 \cup G_2) \leq H(G_1) + H(G_2)</math>.
* Arithmetic mean of disjoint unions. Let <math>G_1, G_2, \cdots, G_k</math> be a sequence of graphs on disjoint sets of vertices, with <math>n_1, n_2, \cdots, n_k</math> vertices, respectively. Then <math>H(G_1 \cup G_2 \cup \cdots \cup G_k) = \tfrac{1}{\sum_{i=1}^{k}n_i}\sum_{i=1}^{k}{n_i H(G_i)}</math>.
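For example, using the formula for complete graphs given below, the disjoint union of an edge <math>K_2</math> (entropy <math>1</math>) and a complete graph <math>K_4</math> (entropy <math>2</math>) has entropy <math>\tfrac{2 \cdot 1 + 4 \cdot 2}{2 + 4} = \tfrac{5}{3}</math>.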
Additionally, simple formulas exist for certain families of graphs.
* Edge-less graphs have entropy <math>0</math>.
* Complete graphs on <math>n</math> vertices have entropy <math>\log_2 n</math>.
* Complete balanced ''k''-partite graphs have entropy <math>\log_2 k</math>. In particular:
** Complete balanced [[bipartite graphs]] have entropy <math>1</math>.
* Complete [[bipartite graphs]] with <math>n</math> vertices in one partition and <math>m</math> in the other have entropy <math>H\left(\frac{n}{m+n}\right)</math>, where <math>H</math> is the [[binary entropy function]].
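These formulas can be checked against the definition for small graphs. The following Python sketch is an illustrative brute-force computation, not a standard algorithm: the function names and the use of alternating minimisation over the joint distribution are choices made here for illustration. It enumerates the independent sets of a small graph, numerically minimises the mutual information <math>I(X ; Y)</math>, and recovers <math>\log_2 4 = 2</math> for the complete graph <math>K_4</math> and <math>H(1/4) \approx 0.811</math> for the complete bipartite graph <math>K_{1,3}</math>.

<syntaxhighlight lang="python">
from itertools import combinations
from math import log2

def independent_sets(n, edges):
    """Return all non-empty independent vertex sets of the graph ({0,...,n-1}, edges)."""
    edge_set = {frozenset(e) for e in edges}
    sets = []
    for k in range(1, n + 1):
        for s in combinations(range(n), k):
            if all(frozenset(p) not in edge_set for p in combinations(s, 2)):
                sets.append(set(s))
    return sets

def graph_entropy(n, edges, iters=1000):
    """Approximate H(G) = min I(X;Y) over joint distributions in which X is
    uniform on the vertices and Y is an independent set containing X."""
    ind = independent_sets(n, edges)
    m = len(ind)
    r = [1.0 / m] * m  # current guess for the marginal distribution of Y
    q = None
    for _ in range(iters):
        # Best conditional q(y|x) for this marginal: proportional to r(y) on {y : x in y}.
        q = []
        for x in range(n):
            row = [r[j] if x in ind[j] else 0.0 for j in range(m)]
            z = sum(row)
            q.append([v / z for v in row])
        # Re-estimate the marginal of Y induced by the joint distribution (X uniform).
        r = [sum(q[x][j] for x in range(n)) / n for j in range(m)]
    # Mutual information I(X;Y) of the final joint distribution.
    return sum(q[x][j] / n * log2(q[x][j] / r[j])
               for x in range(n) for j in range(m) if q[x][j] > 0)

# Complete graph K_4: expected entropy log2(4) = 2.
print(graph_entropy(4, [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]))
# Complete bipartite graph K_{1,3}: expected entropy H(1/4) ≈ 0.811.
print(graph_entropy(4, [(0, 1), (0, 2), (0, 3)]))
</syntaxhighlight>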
==Example==
Here, we use properties of graph entropy to provide a simple proof that a complete graph <math>G</math> on <math>n</math> vertices cannot be expressed as the union of fewer than <math>\log_2 n</math> bipartite graphs.
''Proof'' By monotonicity, no bipartite graph can have graph entropy greater than that of a complete bipartite graph, which is bounded by <math>1</math>. Thus, by subadditivity, the union of <math>k</math> bipartite graphs cannot have entropy greater than <math>k</math>. Now let <math>G = (V, E)</math> be a complete graph on <math>n</math> vertices. By the properties listed above, <math>H(G) = \log_2 n</math>. Therefore, the union of fewer than <math>\log_2 n</math> bipartite graphs cannot have the same entropy as <math>G</math>, so <math>G</math> cannot be expressed as such a union. <math>\blacksquare</math>
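For example, since <math>\log_2 4 = 2</math>, the complete graph on four vertices cannot be covered by a single bipartite graph; two complete bipartite graphs do suffice (label the four vertices with two bits and, for each bit, take the complete bipartite graph between the vertices where that bit is <math>0</math> and those where it is <math>1</math>), so the bound is attained for <math>n = 4</math>.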
==General References==
*{{cite book|author1=Matthias Dehmer|author2=Frank Emmert-Streib|author3=Zengqiang Chen |author4=Xueliang Li |author5=Yongtang Shi |title=Mathematical Foundations and Applications of Graph Entropy|url=https://books.google.com/books?id=CZ-_DAAAQBAJ|date=25 July 2016|publisher=Wiley|isbn=978-3-527-69325-2}}
==Notes==
{{Reflist}}
[[Category:Information theory]]
[[Category:Graph theory]]