Graph entropy
In [[information theory]], the '''graph entropy''' is a measure of the information rate achievable by communicating symbols over a channel in which certain pairs of values may be confused.<ref name="DehmerMowshowitz2013">{{cite book|author1=Matthias Dehmer|author2=Abbe Mowshowitz|author3=Frank Emmert-Streib|title=Advances in Network Complexity|url=https://books.google.com/books?id=fHxARaCPTKwC&pg=PT186|date=21 June 2013|publisher=John Wiley & Sons|isbn=978-3-527-67048-2|pages=186–}}</ref> This measure, first introduced by Körner in the 1970s,<ref>{{cite journal|last=Körner|first=János|date=1973|title=Coding of an information source having ambiguous alphabet and the entropy of graphs|journal=6th Prague Conference on Information Theory|pages=411–425}}</ref><ref name="LoboKasparis2008">{{cite book|author1=Niels da Vitoria Lobo|author2=Takis Kasparis|author3=Michael Georgiopoulos|title=Structural, Syntactic, and Statistical Pattern Recognition: Joint IAPR International Workshop, SSPR & SPR 2008, Orlando, USA, December 4-6, 2008. Proceedings|url=https://books.google.com/books?id=DWZc9Ti_vBwC&pg=PA237|date=24 November 2008|publisher=Springer Science & Business Media|isbn=978-3-540-89688-3|pages=237–}}</ref> has since proven useful in other settings, including combinatorics.<ref name="BouchonSaitta1988">{{cite book|author1=Bernadette Bouchon|author2=Lorenza Saitta|author3=Ronald R. Yager|title=Uncertainty and Intelligent Systems: 2nd International Conference on Information Processing and Management of Uncertainty in Knowledge Based Systems IPMU '88. Urbino, Italy, July 4-7, 1988. Proceedings|url=https://books.google.com/books?id=V5-6G433s_wC&pg=PA112|date=8 June 1988|publisher=Springer Science & Business Media|isbn=978-3-540-19402-6|pages=112–}}</ref>
 
==Definition==
Let <math>G = (V, E)</math> be an undirected graph. The graph entropy of <math>G</math>, denoted <math>H(G)</math>, is defined as
:<math>H(G) = \min_{X, Y} I(X; Y)</math>
where <math>X</math> is chosen uniformly from <math>V</math>, <math>Y</math> ranges over independent sets of <math>G</math>, the joint distribution of <math>X</math> and <math>Y</math> is such that <math>X \in Y</math> with probability one, and <math>I(X; Y)</math> is the [[mutual information]] of <math>X</math> and <math>Y</math>.
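To make the definition concrete, the following is a minimal Python sketch (illustrative, not part of the article; it assumes NumPy and SciPy, and the function names are hypothetical) that computes <math>H(G)</math> for small graphs by brute force. It relies on the standard equivalent characterization <math>H(G) = \min_{a \in \operatorname{VP}(G)} \tfrac{1}{n}\sum_{i} \lg(1/a_i)</math>, where <math>\operatorname{VP}(G)</math> is the vertex packing polytope of <math>G</math>, i.e. the convex hull of the indicator vectors of its independent sets.

<syntaxhighlight lang="python">
from itertools import combinations

import numpy as np
from scipy.optimize import minimize


def independent_sets(n, edges):
    """Enumerate all independent sets of the graph as 0/1 indicator vectors."""
    edge_set = {frozenset(e) for e in edges}
    vectors = []
    for r in range(n + 1):
        for subset in combinations(range(n), r):
            if all(frozenset(p) not in edge_set
                   for p in combinations(subset, 2)):
                v = np.zeros(n)
                v[list(subset)] = 1.0
                vectors.append(v)
    return np.array(vectors)


def graph_entropy(n, edges):
    """Minimize (1/n) * sum_i -lg(a_i) over points a in VP(G)."""
    chi = independent_sets(n, edges)   # one row per independent set
    k = len(chi)

    def objective(lam):
        a = lam @ chi                  # convex combination -> a point in VP(G)
        return float(np.mean(-np.log2(np.maximum(a, 1e-12))))

    lam0 = np.full(k, 1.0 / k)         # start from the barycenter of the simplex
    result = minimize(objective, lam0, method="SLSQP",
                      bounds=[(0.0, 1.0)] * k,
                      constraints=[{"type": "eq",
                                    "fun": lambda lam: lam.sum() - 1.0}])
    return result.fun


# Sanity checks against the closed forms in the Properties section:
print(graph_entropy(3, [(0, 1), (0, 2), (1, 2)]))          # K_3 -> lg 3 ~ 1.585
print(graph_entropy(4, [(0, 2), (0, 3), (1, 2), (1, 3)]))  # K_{2,2} -> 1.0
</syntaxhighlight>

Since the objective is convex in <math>a</math> and <math>a</math> depends linearly on the simplex weights, any local optimizer suffices here; the exponential enumeration of independent sets limits this sketch to small graphs.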
==Properties==
* Monotonicity. If <math>G_1</math> is a subgraph of <math>G_2</math> on the same vertex set, then <math>H(G_1) \leq H(G_2)</math>.
* Subadditivity. Given two graphs <math>G_1 = (V, E_1)</math> and <math>G_2 = (V, E_2)</math> on the same set of vertices, the [[Graph operations#Binary operations|graph union]] <math>G_1 \cup G_2 = (V, E_1 \cup E_2)</math> satisfies <math>H(G_1 \cup G_2) \leq H(G_1) + H(G_2)</math>.
* Arithmetic mean of disjoint unions. Let <math>G_1, G_2, \cdots, G_k</math> be a sequence of graphs on disjoint sets of vertices, with <math>n_1, n_2, \cdots, n_k</math> vertices, respectively. Then <math>H(G_1 \cup G_2 \cup \cdots \cup G_k) = \tfrac{1}{\sum_{i=1}^{k}n_i}\sum_{i=1}^{k}{n_i H(G_i)}</math> (a worked example follows this list).
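For example, combining the disjoint-union rule with the closed form <math>H(K_n) = \lg n</math> given below, the disjoint union of <math>K_2</math> and <math>K_4</math> has entropy
:<math>H(K_2 \cup K_4) = \tfrac{1}{2+4}\left(2 \lg 2 + 4 \lg 4\right) = \tfrac{2 + 8}{6} = \tfrac{5}{3}.</math>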
 
Additionally, simple formulas exist for certain classes of graphs.
* Edge-less graphs have entropy <math>0</math>.
* [[Complete graph|Complete graphs]] on <math>n</math> vertices have entropy <math>\lg n</math>, where <math>\lg</math> is the [[binary logarithm]].
* Complete balanced [[Multipartite graph|k-partite graphs]] have entropy <math>\lg k</math>. In particular, complete balanced [[bipartite graphs]] have entropy <math>1</math>.
* Complete [[bipartite graphs]] with <math>n</math> vertices in one partition and <math>m</math> in the other have entropy <math>H\left(\frac{n}{m+n}\right)</math>, where <math>H</math> is the [[binary entropy function]].
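The closed forms above are straightforward to evaluate directly; the following is a small Python sketch (illustrative, not part of the article; the function names are hypothetical) that computes each of them.

<syntaxhighlight lang="python">
import math


def entropy_complete(n):
    """H(K_n) = lg n."""
    return math.log2(n)


def entropy_complete_multipartite(k):
    """Complete balanced k-partite graph: entropy lg k."""
    return math.log2(k)


def binary_entropy(p):
    """Binary entropy function H(p) = -p lg p - (1 - p) lg (1 - p)."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)


def entropy_complete_bipartite(n, m):
    """H(K_{n,m}) = H(n / (m + n)), with H the binary entropy function."""
    return binary_entropy(n / (m + n))


print(entropy_complete(4))               # 2.0
print(entropy_complete_multipartite(2))  # 1.0, matching the balanced bipartite case
print(entropy_complete_bipartite(1, 3))  # H(0.25) ~ 0.811
</syntaxhighlight>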