Kolmogorov complexity: Difference between revisions

 
Similar ideas are used to prove the properties of [[Chaitin's constant]].
 
The Minimum Message Length (MML) principle of statistical and inductive inference and machine learning was independently developed by C.S. Wallace and D.M. Boulton in 1968. MML is Bayesian (it incorporates prior beliefs) and information-theoretic. It has the desirable properties of statistical invariance (the inference transforms with a re-parameterisation, such as from polar co-ordinates to Cartesian co-ordinates), statistical consistency (even for very hard problems, MML will converge to any underlying model) and efficiency (the MML model will converge to any true underlying model about as quickly as is possible). C.S. Wallace and D.L. Dowe showed a formal connection between MML and Kolmogorov complexity in 1999.
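
The core idea of MML can be sketched as a two-part code: the total message length is the number of bits needed to state a model plus the bits needed to encode the data given that model, and the preferred model is the one minimising the sum. The sketch below is a hypothetical illustration only: the flat 32-bits-per-parameter cost and the Gaussian residual code are assumptions for demonstration, not Wallace's actual parameter-precision construction.

```python
import math

BITS_PER_PARAM = 32.0  # assumed flat cost per stated parameter (illustrative)

def residual_bits(residuals):
    """Crude Gaussian code length (in bits) for encoding the residuals."""
    n = len(residuals)
    var = max(sum(r * r for r in residuals) / n, 1e-12)
    return 0.5 * n * math.log2(2 * math.pi * math.e * var)

def constant_model_length(xs, ys):
    """Two-part length: 1 parameter (the mean) + data given the model."""
    mean = sum(ys) / len(ys)
    return 1 * BITS_PER_PARAM + residual_bits([y - mean for y in ys])

def linear_model_length(xs, ys):
    """Two-part length: 2 parameters (slope, intercept) + data given the model."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    resid = [y - (slope * x + intercept) for x, y in zip(xs, ys)]
    return 2 * BITS_PER_PARAM + residual_bits(resid)

# Nearly linear data: the extra parameter of the linear model pays for
# itself many times over in shorter encoding of the residuals.
xs = list(range(50))
ys = [3.0 * x + 7.0 + ((-1) ** x) * 0.25 for x in xs]
print(constant_model_length(xs, ys) > linear_model_length(xs, ys))
```

On this data the linear model's total message is far shorter, so MML selects it; on pure noise the constant model would win, since the second parameter would cost more bits than it saves.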
 
----