Kolmogorov complexity

The field was developed by [[Andrey Kolmogorov]], [[Ray Solomonoff]] and [[Gregory Chaitin]] starting in the late [[1960s]]. There are several
variants of Kolmogorov complexity or algorithmic information. The most widely used one is based on self-delimiting programs and is due to [[Leonid Levin]] (1974).
 
 
To formalize the above definition of complexity, one has to specify exactly what types of programs are allowed. Fortunately, the choice hardly matters: one could take a particular notation for [[Turing machine|Turing machines]], or [[Lisp programming language|LISP]] programs, or [[Pascal programming language|Pascal]] programs, or [[Java virtual machine]] bytecode. By the invariance theorem, switching between any two such universal languages changes the complexity of a string by at most an additive constant that depends only on the two languages, not on the string.
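Although Kolmogorov complexity itself is uncomputable, any fixed compressor yields a computable upper bound on it, since a compressed string plus a fixed decompressor is a valid description. The following is a minimal sketch of this idea in Python using the standard-library zlib compressor; the function name and test strings are illustrative, not part of any standard definition.

```python
import os
import zlib

def complexity_upper_bound(s: bytes) -> int:
    # Length of a zlib-compressed description of s.
    # Any fixed compressor gives a computable upper bound on K(s),
    # up to an additive constant for the (fixed-size) decompressor.
    return len(zlib.compress(s, 9))

regular = b"ab" * 500      # highly regular: a short program can print it
noisy = os.urandom(1000)   # random bytes are incompressible with high probability

print(complexity_upper_bound(regular))  # much smaller than 1000
print(complexity_upper_bound(noisy))    # close to (or above) 1000
```

The gap between the two bounds illustrates the intuition behind the definition: regular strings have short descriptions, while random strings do not. The converse does not hold as a proof technique, since no compressor's failure to shrink a string certifies that the string is truly incompressible.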
can be proven within the axiomatic system (even though, as we know, the vast majority of those statements must be true). Again, the proof of this result is a formalization of Berry's paradox.
 
Similar ideas are used to prove the properties of [[Chaitin's constant]].
 
The [[minimum message length]] principle of statistical and inductive inference and machine learning was independently developed by C.S. Wallace and D.M. Boulton in 1968. MML is [[Bayesian probability|Bayesian]] (it incorporates prior beliefs) and