In [[information theory]], the '''graph entropy''' is a measure of the information rate achievable by communicating symbols over a channel in which certain pairs of values may be confused.<ref name="DehmerMowshowitz2013">{{cite book|author1=Matthias Dehmer|author2=Abbe Mowshowitz|author3=Frank Emmert-Streib|title=Advances in Network Complexity|url=https://books.google.com/books?id=fHxARaCPTKwC&pg=PT186|date=21 June 2013|publisher=John Wiley & Sons|isbn=978-3-527-67048-2|pages=186–}}</ref> This measure was first introduced by Körner in the 1970s.<ref>{{cite journal|last=Körner|first=János|date=1973|title=Coding of an information source having ambiguous alphabet and the entropy of graphs|journal=6th Prague Conference on Information Theory}}</ref>
==Definition==
Additionally, simple closed-form expressions exist for the entropy of certain families of graphs.
* Edge-less graphs have entropy <math>0</math>.
* [[Complete graph]]s on <math>n</math> vertices have entropy <math>\log_2 n</math>.
* Complete [[bipartite graphs]] with <math>n</math> vertices in one partition and <math>m</math> in the other have entropy <math>H\left(\frac{n}{m+n}\right)</math>, where <math>H</math> is the [[binary entropy function]].
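The closed-form values above can be sketched in Python; this is an illustrative sketch, and the function names are our own rather than any standard API:

```python
import math

def binary_entropy(p):
    """Binary entropy function H(p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def entropy_edgeless(n):
    """An edge-less graph on n vertices has graph entropy 0."""
    return 0.0

def entropy_complete(n):
    """The complete graph on n vertices has graph entropy log2(n)."""
    return math.log2(n)

def entropy_complete_bipartite(n, m):
    """The complete bipartite graph K_{n,m} has graph entropy
    H(n / (m + n)), with H the binary entropy function."""
    return binary_entropy(n / (m + n))
```

For example, `entropy_complete(8)` gives 3 bits, and `entropy_complete_bipartite(1, 1)` gives 1 bit, since the balanced case maximizes the binary entropy.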