Kolmogorov complexity: Difference between revisions

 
 
To formalize the above definition of complexity, one has to specify exactly what types of programs are allowed. Fortunately, it doesn't really matter: one could take a particular notation for [[Turing machine|Turing machines]], or [[Lisp programming language|LISP]] programs, or [[Pascal programming language|Pascal]] programs, or [[Java virtual machine]] bytecode. If we agree to measure the lengths of all objects consistently in [[bit|bits]], then the resulting notions of complexity differ only by a multiplicative and an additive constant: if ''I''<sub>1</sub>(''s'') and ''I''<sub>2</sub>(''s'') are the complexities of the string ''s'' according to two different programming languages ''L''<sub>1</sub> and ''L''<sub>2</sub>, then there are constants ''C'' and ''D'' (which depend only on the languages chosen, not on ''s'') such that
:<math>I_1(s) \le C \cdot I_2 (s) + D </math>
Here, ''D'' is the length of an interpreter for ''L''<sub>2</sub> written in ''L''<sub>1</sub>, and ''C'' describes the overhead of storing programs written in ''L''<sub>2</sub> as part of programs in ''L''<sub>1</sub>.
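The interpreter argument above can be illustrated with a toy sketch. This is not a real Kolmogorov-complexity computation (true complexity is uncomputable); the "shortest L<sub>2</sub> program" and the interpreter below are hypothetical stand-ins, chosen only to show how prepending a fixed-size interpreter yields the bound ''I''<sub>1</sub>(''s'') &le; ''I''<sub>2</sub>(''s'') + ''D'':

```python
# Toy illustration of the invariance argument (a sketch, not a real
# Kolmogorov-complexity computation -- true K(s) is uncomputable).
# We model "programs" as strings. A fixed interpreter text lets
# language L1 run any L2 program, so every L2 description of s gives
# an L1 description of length len(interpreter) + len(L2 program):
#     I_1(s) <= I_2(s) + D,   where D = len(INTERPRETER).

def shortest_l2_program(s: str) -> str:
    """Hypothetical shortest known L2 program that outputs s:
    here just a literal print statement (an assumption for the demo)."""
    return f'print("{s}")'

# Hypothetical fixed-size interpreter for L2, written in L1.
# Its length D does not depend on s.
INTERPRETER = "def run_l2(src): exec(src)\n"

def i2(s: str) -> int:
    """Description length of s under language L2, in characters."""
    return len(shortest_l2_program(s))

def i1_via_interpreter(s: str) -> int:
    """Upper bound on the L1 description length of s:
    the interpreter followed by the L2 program."""
    return len(INTERPRETER) + i2(s)

s = "abababababab"
D = len(INTERPRETER)
# The bound holds (with C = 1, since both lengths use the same units):
assert i1_via_interpreter(s) <= i2(s) + D
```

Because both languages here measure length in the same units (characters), the multiplicative constant ''C'' is 1 and only the additive interpreter cost ''D'' appears.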
polar co-ordinates to Cartesian co-ordinates), statistical consistency (even
for very hard problems, MML will converge to any underlying model) and efficiency (the MML model will converge to any true underlying model about as quickly as is possible). C.S. Wallace and D.L. Dowe showed a formal connection between MML and algorithmic information theory (or Kolmogorov complexity) in 1999.
 
 
==External links==