Chain rule for Kolmogorov complexity

The [[chain rule]] for [[Kolmogorov complexity]] is an analogue of the chain rule for [[Information entropy]], which states:
 
:<math>
H(X,Y) = H(X) + H(Y|X)
</math>
 
That is, the combined [[randomness]] of two sequences ''X'' and ''Y'' is the sum of the randomness of ''X'' plus whatever randomness is left in ''Y'' once we know ''X''.
This follows immediately from the definitions of [[conditional entropy|conditional]] and [[joint entropy]], and the fact from [[probability theory]] that the [[joint probability]] is the product of the [[marginal probability|marginal]] and [[conditional probability]]:

:<math>
P(X,Y) = P(X)\,P(Y|X)
</math>
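
The identity can be verified directly from the definitions; a sketch of the calculation for discrete random variables, expanding the joint entropy and splitting the logarithm:

:<math>
\begin{align}
H(X,Y) &= -\sum_{x,y} p(x,y) \log p(x,y) \\
&= -\sum_{x,y} p(x,y) \log \bigl( p(x)\, p(y|x) \bigr) \\
&= -\sum_{x,y} p(x,y) \log p(x) \; - \sum_{x,y} p(x,y) \log p(y|x) \\
&= H(X) + H(Y|X).
\end{align}
</math>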