Kolmogorov complexity: Difference between revisions
Similar ideas are used to prove the properties of [[Chaitin's constant]].
 
The [[minimum message length]] ([http://www.csse.monash.edu.au/~dld/MML.html MML]) principle of statistical and inductive inference and machine learning was developed independently by [http://www.csse.monash.edu.au/cgi-bin/pub_search?publication_type=0&year=&authors=wallace&title= C.S. Wallace] and D.M. Boulton in 1968. MML is Bayesian (it incorporates prior beliefs) and information-theoretic. It has the desirable properties of statistical invariance (the inference is invariant under a re-parameterisation, such as a change from polar to Cartesian co-ordinates), statistical consistency (even for very hard problems, MML will converge to any underlying model) and efficiency (the MML estimate converges to the true underlying model about as quickly as is possible). [http://www.csse.monash.edu.au/cgi-bin/pub_search?publication_type=0&year=&authors=wallace&title= C.S. Wallace] and [http://www.csse.monash.edu.au/~dld D.L. Dowe] showed a formal connection between MML and algorithmic information theory (or Kolmogorov complexity) in 1999.
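The information-theoretic side of this connection can be made concrete: while Kolmogorov complexity itself is uncomputable, the length of any losslessly compressed encoding of a string is a computable upper bound on it (up to an additive constant for the decompressor). A minimal sketch in Python, using zlib purely as an illustrative stand-in for an ideal coder (the choice of compressor and the example strings here are assumptions for illustration, not part of the formal theory):

```python
import random
import zlib

def complexity_upper_bound(s: bytes) -> int:
    """Length of a zlib-compressed encoding of s: a computable upper
    bound (up to an additive constant) on the Kolmogorov complexity of s."""
    return len(zlib.compress(s, 9))

# A highly regular string: it has a short description ("ab" repeated 500 times),
# so any reasonable compressor finds a much shorter encoding.
regular = b"ab" * 500

# A fixed pseudo-random string of the same length: it has no short pattern
# for the compressor to exploit, so its bound stays near the raw length.
rng = random.Random(0)
irregular = bytes(rng.randrange(256) for _ in range(1000))
```

Comparing `complexity_upper_bound(regular)` with `complexity_upper_bound(irregular)` shows the regular string admitting a far shorter encoding, mirroring the intuition that low Kolmogorov complexity means a short effective description.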
 
 
==External links==
 
* [http://www.cs.umaine.edu/~chaitin/ Chaitin's online publications]
* [http://www.idsia.ch/~juergen/ray.html Solomonoff's IDSIA page]