In [[information theory]], the '''graph entropy''' is a measure of the information rate achievable by communicating symbols over a channel in which certain pairs of values may be confused.<ref name="DehmerMowshowitz2013">{{cite book|author1=Matthias Dehmer|author2=Abbe Mowshowitz|author3=Frank Emmert-Streib|title=Advances in Network Complexity|url=https://books.google.com/books?id=fHxARaCPTKwC&pg=PT186|date=21 June 2013|publisher=John Wiley & Sons|isbn=978-3-527-67048-2|pages=186–}}</ref> This measure, first introduced by Körner in the 1970s,<ref>{{cite journal|last=Körner|first=János}}</ref>
==Definition==
==Properties==
* Monotonicity. If <math>G_1</math> is a subgraph of <math>G_2</math> on the same vertex set, then <math>H(G_1) \leq H(G_2)</math>.
* Subadditivity. Given two graphs <math>G_1 = (V, E_1)</math> and <math>G_2 = (V, E_2)</math> on the same set of vertices, the [[graph union]] <math>G_1 \cup G_2 = (V, E_1 \cup E_2)</math> satisfies <math>H(G_1 \cup G_2) \leq H(G_1) + H(G_2)</math>.
* Arithmetic mean of disjoint unions. Let <math>G_1, G_2, \cdots, G_k</math> be a sequence of graphs on disjoint sets of vertices, with <math>n_1, n_2, \cdots, n_k</math> vertices, respectively. Then <math>H(G_1 \cup G_2 \cup \cdots \cup G_k) = \tfrac{1}{\sum_{i=1}^{k}n_i}\sum_{i=1}^{k}{n_i H(G_i)}</math>.
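As a hypothetical numeric check of the disjoint-union rule, suppose <math>H(G_1) = 1</math> and <math>H(G_2) = 2</math> for graphs on <math>n_1 = 2</math> and <math>n_2 = 4</math> vertices (values chosen purely for illustration):

```latex
H(G_1 \cup G_2)
  = \frac{n_1 H(G_1) + n_2 H(G_2)}{n_1 + n_2}
  = \frac{2 \cdot 1 + 4 \cdot 2}{2 + 4}
  = \frac{10}{6}
  = \frac{5}{3}.
```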
Additionally, simple formulas exist for certain classes of graphs.
* Edge-less graphs have entropy <math>0</math>.
* [[Complete graph]]s on <math>n</math> vertices have entropy <math>\lg n</math>.
* Complete balanced [[Multipartite graph|k-partite graphs]] have entropy <math>\lg k</math>, where <math>\lg</math> is the binary logarithm. In particular, complete balanced [[bipartite graphs]] have entropy <math>1</math>.
* Complete [[bipartite graphs]] with <math>n</math> vertices in one partition and <math>m</math> in the other have entropy <math>H\left(\frac{n}{m+n}\right)</math>, where <math>H</math> is the [[binary entropy function]].
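The closed forms above are easy to evaluate numerically. A minimal sketch in Python follows; the helper names are illustrative, not part of any standard library:

```python
import math

def binary_entropy(p):
    """Binary entropy function H(p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Closed forms from the list above (function names are hypothetical):
def entropy_complete(n):
    """Complete graph on n vertices: entropy lg n."""
    return math.log2(n)

def entropy_balanced_kpartite(k):
    """Complete balanced k-partite graph: entropy lg k."""
    return math.log2(k)

def entropy_complete_bipartite(n, m):
    """Complete bipartite graph K_{n,m}: entropy H(n/(m+n))."""
    return binary_entropy(n / (m + n))

print(entropy_complete(4))               # -> 2.0
print(entropy_balanced_kpartite(2))      # -> 1.0 (balanced bipartite case)
print(entropy_complete_bipartite(2, 3))  # H(2/5), a bit below 1
```

Note that the balanced bipartite value <math>1</math> agrees with both formulas: <math>\lg 2 = H(\tfrac{1}{2}) = 1</math>.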