Graph entropy

In [[information theory]], the '''graph entropy''' is a measure of the information rate achievable by communicating symbols over a channel in which certain pairs of values may be confused.<ref name="DehmerMowshowitz2013">{{cite book|author1=Matthias Dehmer|author2=Abbe Mowshowitz|author3=Frank Emmert-Streib|title=Advances in Network Complexity|url=https://books.google.com/books?id=fHxARaCPTKwC&pg=PT186|date=21 June 2013|publisher=John Wiley & Sons|isbn=978-3-527-67048-2|pages=186–}}</ref> This measure, first introduced by Körner in the 1970s,<ref>{{cite journal|last=Körner|first=János|date=1973|title=Coding of an information source having ambiguous alphabet and the entropy of graphs|journal=6th Prague Conference on Information Theory|pages=411–425}}</ref><ref name="LoboKasparis2008">{{cite book|author1=Niels da Vitoria Lobo|author2=Takis Kasparis|author3=Michael Georgiopoulos|title=Structural, Syntactic, and Statistical Pattern Recognition: Joint IAPR International Workshop, SSPR & SPR 2008, Orlando, USA, December 4-6, 2008. Proceedings|url=https://books.google.com/books?id=DWZc9Ti_vBwC&pg=PA237|date=24 November 2008|publisher=Springer Science & Business Media|isbn=978-3-540-89688-3|pages=237–}}</ref> has since also proven itself useful in other settings, including combinatorics.<ref name="BouchonSaitta1988">{{cite book|author1=Bernadette Bouchon|author1-link= Bernadette Bouchon-Meunier |author2=Lorenza Saitta|author2-link= Lorenza Saitta |author3=Ronald R. Yager|title=Uncertainty and Intelligent Systems: 2nd International Conference on Information Processing and Management of Uncertainty in Knowledge Based Systems IPMU '88. Urbino, Italy, July 4-7, 1988. Proceedings|url=https://books.google.com/books?id=V5-6G433s_wC&pg=PA112|date=8 June 1988|publisher=Springer Science & Business Media|isbn=978-3-540-19402-6|pages=112–}}</ref>
 
==Definition==
::<math>H(G) = \min_{X,Y} I(X ; Y)</math>
 
where <math>X</math> is chosen [[Discrete uniform distribution|uniformly]] from <math>V</math>, <math>Y</math> ranges over [[Independent set (graph theory)|independent sets]] of <math>G</math>, the joint distribution of <math>X</math> and <math>Y</math> is such that <math>X\in Y</math> with probability one, and <math>I(X ; Y)</math> is the [[mutual information]] of <math>X</math> and <math>Y</math>.<ref>G. Simonyi, "Perfect graphs and graph entropy. An updated survey," Perfect Graphs, John Wiley and Sons (2001) pp. 293–328, Definition 2</ref>
 
That is, if we let <math>\mathcal{I}</math> denote the independent vertex sets in <math>G</math>, we wish to find the joint distribution <math>X,Y</math> on <math>V \times \mathcal{I}</math> with the lowest mutual information such that (i) the marginal distribution of the first term is uniform and (ii) in samples from the distribution, the second term contains the first term almost surely. The mutual information of <math>X</math> and <math>Y</math> is then called the entropy of <math>G</math>.
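As an illustration of the definition, consider the complete graph <math>K_n</math>: the only independent sets containing a given vertex are singletons, so <math>Y = \{X\}</math> is forced and <math>I(X;Y) = H(X) = \log_2 n</math>. A minimal Python sketch (the helper name <code>mutual_information</code> is ours, not from any library) evaluates that joint distribution directly:

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits from a dict {(x, y): probability}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Complete graph K_3: X is uniform on the vertices and Y = {X} is the
# only admissible independent set, so I(X;Y) = H(X) = log2(3).
n = 3
joint = {(v, frozenset([v])): 1.0 / n for v in range(n)}
print(mutual_information(joint))  # ≈ 1.585 = log2(3)
```

For graphs with larger independent sets, this function only evaluates one candidate joint distribution, i.e. an upper bound on <math>H(G)</math>; the entropy itself is the minimum over all admissible joints.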
 
==Properties==
* Monotonicity. If <math>G_1</math> is a subgraph of <math>G_2</math> on the same vertex set, then <math>H(G_1) \leq H(G_2)</math>.
* Subadditivity. Given two graphs <math>G_1 = (V, E_1)</math> and <math>G_2 = (V, E_2)</math> on the same set of vertices, the [[Graph_operationsGraph operations#Binary_operationsBinary operations|graph union]] <math>G_1 \cup G_2 = (V, E_1 \cup E_2)</math> satisfies <math>H(G_1 \cup G_2) \leq H(G_1) + H(G_2)</math>.
* Arithmetic mean of disjoint unions. Let <math>G_1, G_2, \cdots, G_k</math> be a sequence of graphs on disjoint sets of vertices, with <math>n_1, n_2, \cdots, n_k</math> vertices, respectively. Then <math>H(G_1 \cup G_2 \cup \cdots \cup G_k) = \tfrac{1}{\sum_{i=1}^{k}n_i}\sum_{i=1}^{k}{n_i H(G_i)}</math>.
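As a quick numerical check of the disjoint-union rule (using the fact, noted in the formulas for specific graph classes, that a complete graph on <math>n</math> vertices has entropy <math>\log_2 n</math>):

```python
import math

# Disjoint union of K_2 and K_4; complete graphs K_n have entropy log2(n),
# so the weighted arithmetic mean is (2*1 + 4*2) / (2 + 4) = 10/6.
sizes = [2, 4]
entropies = [math.log2(n) for n in sizes]  # [1.0, 2.0]
h_union = sum(n * h for n, h in zip(sizes, entropies)) / sum(sizes)
print(h_union)  # ≈ 1.667
```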
 
Additionally, simple formulas exist for certain classes of graphs.
* Complete balanced [[Multipartite graph|k-partite graphs]] have entropy <math>\log_2 k</math>. In particular,
** Edge-less graphs have entropy <math>0</math>.
** [[Complete graph|Complete graphs]] on <math>n</math> vertices have entropy <math>\log_2 n</math>.
** Complete balanced [[bipartite graphs]] have entropy <math>1</math>.
* Complete [[bipartite graphs]] with <math>n</math> vertices in one partition and <math>m</math> in the other have entropy <math>H\left(\frac{n}{m+n}\right)</math>, where <math>H</math> is the [[binary entropy function]].
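The unbalanced bipartite formula can be evaluated directly. In the sketch below (the helper name <code>binary_entropy</code> is ours, not a standard-library function), choosing <math>Y</math> to be the whole side of the bipartition containing <math>X</math> makes <math>Y</math> a deterministic function of <math>X</math>, so <math>I(X;Y) = H(Y) = H\!\left(\tfrac{n}{m+n}\right)</math>:

```python
import math

def binary_entropy(p):
    """Binary entropy H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Complete bipartite K_{2,3}: Y = side containing X gives
# I(X;Y) = H(Y) = H(2/5), matching the closed form above.
n, m = 2, 3
print(binary_entropy(n / (m + n)))  # ≈ 0.971
```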
 
==Example==
Here, we use properties of graph entropy to provide a simple proof that a complete graph <math>G</math> on <math>n</math> vertices cannot be expressed as the union of fewer than <math>\log_2 n</math> bipartite graphs.
 
''Proof'' By monotonicity, no bipartite graph can have graph entropy greater than that of a complete bipartite graph, which is bounded by <math>1</math>. Thus, by sub-additivity, the union of <math>k</math> bipartite graphs cannot have entropy greater than <math>k</math>. Now let <math>G = (V, E)</math> be a complete graph on <math>n</math> vertices. By the properties listed above, <math>H(G) = \log_2 n</math>. Therefore, the union of fewer than <math>\log_2 n</math> bipartite graphs cannot have the same entropy as <math>G</math>, so <math>G</math> cannot be expressed as such a union. <math>\blacksquare</math>
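The bound in this proof is tight: <math>\lceil \log_2 n \rceil</math> bipartite graphs do suffice, via the classic construction that places an edge in the <math>i</math>-th graph whenever two vertices differ in the <math>i</math>-th bit of their binary labels. A Python sketch of this construction (the function name <code>bipartite_cover</code> is ours):

```python
import math
from itertools import combinations

def bipartite_cover(n):
    """Cover the edges of K_n with ceil(log2 n) complete bipartite graphs:
    graph i joins vertices whose i-th binary digits differ."""
    k = max(1, math.ceil(math.log2(n)))
    graphs = []
    for i in range(k):
        left = {v for v in range(n) if (v >> i) & 1 == 0}
        right = set(range(n)) - left
        graphs.append({frozenset((u, v)) for u in left for v in right})
    return graphs

# Any two distinct labels differ in some bit, so every edge of K_8
# appears in one of the 3 bipartite graphs.
n = 8
graphs = bipartite_cover(n)
covered = set().union(*graphs)
all_edges = {frozenset(e) for e in combinations(range(n), 2)}
print(len(graphs), covered == all_edges)  # 3 True
```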
 
==General References==
*{{cite book|author1=Matthias Dehmer|author2=Frank Emmert-Streib|author3=Zengqiang Chen |author4=Xueliang Li |author5=Yongtang Shi |title=Mathematical Foundations and Applications of Graph Entropy|url=https://books.google.com/books?id=CZ-_DAAAQBAJ|date=25 July 2016|publisher=Wiley|isbn=978-3-527-69325-2}}
 
==Notes==
{{Reflist}}
[[Category:Information theory]]
[[Category:Graph theory]]
 