Similar ideas are used to prove the properties of [[Chaitin's constant]].
The [[minimum message length]] principle of statistical and inductive inference and machine learning was independently developed by [http://www.csse.monash.edu.au/~dld/CSWallacePublications/ C.S. Wallace] and D.M. Boulton in 1968. MML is [[Bayesian probability|Bayesian]] (it incorporates prior beliefs) and information-theoretic. It has the desirable properties of statistical invariance (the inference transforms with a re-parameterisation, such as from polar coordinates to Cartesian coordinates), statistical consistency (even for very hard problems, MML will converge to any underlying model) and efficiency (the MML model will converge to any true underlying model about as quickly as is possible). [http://www.csse.monash.edu.au/~dld/CSWallacePublications/ C.S. Wallace] and D.L. Dowe showed a formal connection between MML and algorithmic information theory (or Kolmogorov complexity) in 1999.
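The core idea of MML can be sketched as a toy two-part coding experiment: the total message length is the bits needed to state a model plus the bits needed to encode the data given that model, and the shorter total message wins. The sketch below is illustrative only; the function names and the fixed 4-bit parameter precision are assumptions for the example, not Wallace's actual coding scheme.

```python
import math

def two_part_length(data, p, param_bits):
    """Total message length in bits: model part plus data part.

    param_bits -- bits spent stating the model's parameter(s)
    data part  -- minus log2 likelihood of the data under the model
    """
    data_bits = sum(-math.log2(p if x == 1 else 1 - p) for x in data)
    return param_bits + data_bits

# A binary sequence with 16 ones and 4 zeros.
data = [1, 1, 1, 1, 0] * 4

# Model A: fair coin, p fixed at 0.5 -- nothing to transmit about the model.
len_fair = two_part_length(data, 0.5, param_bits=0)

# Model B: biased coin, p estimated from the data and transmitted
# to an (assumed) precision of 4 bits.
p_hat = sum(data) / len(data)
len_biased = two_part_length(data, p_hat, param_bits=4)

# MML prefers whichever model yields the shorter total message.
best = "biased" if len_biased < len_fair else "fair"
```

Here the biased-coin model's shorter data part more than pays for the extra bits spent describing its parameter, so the two-part criterion selects it; on a shorter or more balanced sequence the fair-coin model would win instead.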
==See also==