{{Short description|Lower bound for size of software program}}
{{inline|date=July 2014}}
The chain rule{{cn|date=July 2014}} for [[Kolmogorov complexity]] is an analogue of the chain rule for [[information entropy]], which states:
:<math>
H(X,Y) = H(X) + H(Y|X)
</math>

That is, the information in two strings together equals the information in the first plus the information remaining in the second once the first is known. The corresponding statement for Kolmogorov complexity holds only up to a logarithmic term:

:<math>
K(X,Y) = K(X) + K(Y|X) + O(\log(K(X,Y)))
</math>

(An exact version, {{math|1=''KP''(''x'', ''y'') = ''KP''(''x'') + ''KP''(''y''{{!}}''x*'') + O(1)}}, holds for the prefix complexity ''KP'', where ''x*'' is a shortest program for ''x''.)

It states that the shortest program printing ''X'' and ''Y'' is obtained by concatenating a shortest program printing ''X'' with a program printing ''Y'' given ''X'', plus [[Big-O notation|at most]] a logarithmic factor. The result implies that [[Mutual information#Absolute mutual information|algorithmic mutual information]], an analogue of mutual information for Kolmogorov complexity, is symmetric:

:<math>
I(x:y) = I(y:x) + O(\log(K(x,y)))
</math>

for all ''x'' and ''y''.
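The symmetry can be seen directly, assuming the common definition {{math|1=''I''(''x'':''y'') = ''K''(''y'') − ''K''(''y''{{!}}''x'')}} of algorithmic mutual information: substituting the chain rule for the conditional term gives

:<math>
I(x:y) = K(y) - K(y|x) = K(x) + K(y) - K(x,y) + O(\log(K(x,y))),
</math>

an expression symmetric in ''x'' and ''y'' up to the logarithmic term.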
==Proof==
The ≤ direction is obvious: we can write a program to produce ''x'' and ''y'' by concatenating a program to produce ''x'', a program to produce ''y'' given access to ''x'', and (whence the log term) the length of one of the programs, so that we know where to separate the two programs for ''x'' and {{math|1=''y'' {{!}} ''x''}} ({{math|log ''K''(''x'', ''y'')}} upper-bounds this length).
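The following Python sketch only illustrates the encoding idea behind this direction; it is not actual Kolmogorov-complexity machinery (shortest programs are uncomputable), and the names <code>pack</code> and <code>unpack</code> and the fixed-width length field are stand-ins chosen here:

<syntaxhighlight lang="python">
def pack(prog_x: bytes, prog_y_given_x: bytes) -> bytes:
    """Concatenate a program for x with a program for y-given-x,
    prefixed by the length of the first so a decoder knows where to split.
    In the proof the length field needs only O(log K(x, y)) bits,
    which is the source of the logarithmic term."""
    header = len(prog_x).to_bytes(8, "big")  # fixed-width stand-in for a log-sized field
    return header + prog_x + prog_y_given_x

def unpack(packed: bytes) -> tuple[bytes, bytes]:
    """Recover the two programs: read the length prefix, then split."""
    n = int.from_bytes(packed[:8], "big")
    return packed[8:8 + n], packed[8 + n:]

# Usage: arbitrary byte strings stand in for the (hypothetical) shortest programs.
p_x, p_yx = b"program-for-x", b"program-for-y-given-x"
assert unpack(pack(p_x, p_yx)) == (p_x, p_yx)
</syntaxhighlight>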
For the ≥ direction, it suffices to show that for all {{mvar|k,l}} such that {{tmath|1=k+l = K(x,y)}} we have that either

:<math>K(x|k,l) \le k + O(1)</math>

or

:<math>K(y|x,k,l) \le l + O(1).</math>

Consider the list <math>(a_1,b_1), (a_2,b_2), \ldots, (a_e,b_e)</math> of all pairs <math>(a,b)</math> produced by programs of length exactly <math>K(x,y)</math> (hence <math>K(a,b) \le K(x,y)</math>). Note that this list
* contains the pair <math>(x,y)</math>,
* can be [[recursively enumerable|enumerated]] given {{mvar|k}} and {{mvar|l}} (by running all programs of length <math>k+l = K(x,y)</math> in parallel),
* has at most <math>2^{K(x,y)}</math> elements (because there are at most <math>2^n</math> programs of length {{mvar|n}}).

First, suppose that ''x'' appears less than <math>2^l</math> times as the first element of a pair in the list. Then ''y'' can be specified, given {{mvar|x}}, {{mvar|k}} and {{mvar|l}}, by its index in the sub-list of pairs whose first element is ''x''; by assumption this index is less than <math>2^l</math>, so there is a program for ''y'' given {{mvar|x,k,l}} of length {{tmath|l + O(1)}}.

Now, suppose that ''x'' appears at least <math>2^l</math> times as the first element of a pair. This can happen for at most <math>2^{K(x,y)-l} = 2^k</math> different strings, and these strings can be enumerated given {{mvar|k}} and {{mvar|l}}, so ''x'' can be specified by its index in that enumeration. The corresponding program for ''x'' has length {{tmath|k + O(1)}}, which proves the claim.
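The case analysis can be mirrored by a small Python sketch. The hard-coded list <code>PAIRS</code> stands in for the (uncomputable) enumeration of all pairs produced by programs of length {{tmath|K(x,y)}}, and the helper names are invented for this illustration; the point is only that in either case a short index suffices as a description:

<syntaxhighlight lang="python">
# Toy stand-in for the enumeration of all pairs output by programs of
# length exactly K(x, y); in the proof this list is produced by running
# all such programs in parallel, here it is simply hard-coded.
PAIRS = [("x0", "y0"), ("x1", "y0"), ("x1", "y1"), ("x0", "y1"), ("x2", "y0")]

def describe_y(x, y, pairs):
    """Case 1: x occurs fewer than 2**l times as a first element.
    Describe y by its index in the sub-list of pairs starting with x."""
    sub = [b for a, b in pairs if a == x]
    return sub.index(y)          # an index below 2**l, i.e. about l bits

def recover_y(x, idx, pairs):
    sub = [b for a, b in pairs if a == x]
    return sub[idx]

def heavy_strings(pairs, threshold):
    """Case 2 helper: enumerate, in order of discovery, the strings that
    occur at least `threshold` (= 2**l) times as a first element.
    At most 2**k strings can be this frequent."""
    counts, heavy = {}, []
    for a, _ in pairs:
        counts[a] = counts.get(a, 0) + 1
        if counts[a] == threshold:
            heavy.append(a)
    return heavy

# Case 1: recover "y1" from "x0" plus a short index.
i = describe_y("x0", "y1", PAIRS)
assert recover_y("x0", i, PAIRS) == "y1"

# Case 2: with threshold 2, "x1" is described by its index among heavy strings.
heavy = heavy_strings(PAIRS, threshold=2)
j = heavy.index("x1")            # the short description of x1
assert heavy[j] == "x1"          # looking the index up recovers x1
</syntaxhighlight>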
==References==
* {{cite book | last1=Li | first1=Ming | last2=Vitányi | first2=Paul | title=An Introduction to Kolmogorov Complexity and Its Applications | publisher=Springer | edition=2nd | year=1997 | isbn=0-387-94868-6 }}
* {{cite journal | last=Kolmogorov | first=A. | title=Logical basis for information theory and probability theory | journal=IEEE Transactions on Information Theory | publisher=Institute of Electrical and Electronics Engineers (IEEE) | volume=14 | issue=5 | year=1968 | issn=0018-9448 | doi=10.1109/tit.1968.1054210 | pages=662–664| s2cid=11402549 }}
* {{cite journal | last1=Zvonkin | first1=A K | last2=Levin | first2=L A | title=The complexity of finite objects and the development of the concepts of information and randomness by means of the theory of algorithms | journal=Russian Mathematical Surveys | publisher=IOP Publishing | volume=25 | issue=6 | date=1970-12-31 | issn=0036-0279 | doi=10.1070/rm1970v025n06abeh001269 | pages=83–124| bibcode=1970RuMaS..25...83Z | s2cid=250850390 }}
[[Category:Algorithmic information theory|*]]