::::::Um. Agree that it needs to be sourced. But a definition of ''algorithm'' that doesn't allow for I/O is seriously broken.
::::::The problem with referring to an algorithm as "something that runs on a Turing machine" is that it focuses too much attention on the details of TMs, which are beside the point. It's really ''programs'' that run on Turing machines, not algorithms. An algorithm is supposed to be a higher level of abstraction — you can change details of the program, without changing the algorithm it implements. --[[User:Trovatore|Trovatore]] ([[User talk:Trovatore|talk]]) 22:52, 20 March 2010 (UTC)
:It's just a figure of speech. Instead of saying "the algorithm can run forever" you could say "the algorithm takes ''n'' as input and returns, as output, ''s''<sub>''n''</sub>". If the sequence depends on some other input ''k'', the algorithm takes both ''k'' and ''n'' as inputs. So there is not actually any individual computation of a natural number that runs forever; there is just a computable function that enumerates the sequence ''s''<sub>''n''</sub> as its range. All these English problems disappear as soon as one makes the slightest effort to replace "algorithm" with "computable function" as would usually be done in a recursion theory book. But the translation between "the algorithm takes no input, enumerates the sequence ''s''<sub>''n''</sub>, and never halts" and "the algorithm takes ''n'' as input and returns ''s''<sub>''n''</sub>" is so direct that using the first as a convenient way of expressing the second is extremely common and causes no problems in practice. — Carl <small>([[User:CBM|CBM]] · [[User talk:CBM|talk]])</small> 02:30, 21 March 2010 (UTC)