Talk:Algorithm characterizations: Difference between revisions

:::This bleeding thing keeps coming up; we really should document things somewhere and be able to point to it. The fact is that there's an older sense of the word ''algorithm'' that requires it to halt in finite time on every input. However this restriction tends to get in the way more often than it clarifies, and I don't believe the word is very often used in this strict sense anymore. Of course "not very often" is not the same as "not" (and for that matter I'm not even all that sure about "not very often"), so one way or another we need to deal with the ambiguity wherever it comes up — we can't just silently pick one definition or the other, because that would confuse readers expecting the other definition. --[[User:Trovatore|Trovatore]] ([[User talk:Trovatore|talk]]) 21:31, 20 March 2010 (UTC)
::::Some of the algorithm characterizations in this article allow an algorithm to be partial recursive, i.e. on some inputs the algorithm might run forever without ever producing any output. The question here is different: whether an algorithm can run forever while producing an infinite amount of output. One can think of an algorithm as something that runs on a Turing machine, and of a Turing machine as something that computes an abstract function but cannot do I/O. I/O can happen only through some external mechanism after the Turing machine enters a halt state. Some programming languages are organized around that concept; Haskell's I/O monad, for example, can be viewed that way. I agree that "algorithm" is still a reasonable common-sense term, but we ought to source such a usage, and as far as I can tell we haven't yet done so. [[Special:Contributions/66.127.52.47|66.127.52.47]] ([[User talk:66.127.52.47|talk]]) 21:53, 20 March 2010 (UTC)
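::::(An illustrative sketch, not from any source under discussion, of the two distinct senses above: a ''partial'' procedure that halts with output on some inputs but runs forever with ''no'' output on others — here the classic unbounded-minimization (μ) operator from the theory of partial recursive functions — versus a procedure that by design never halts and produces an ''infinite'' stream of output. The function names are made up for the example.)

```python
def mu(f):
    """Unbounded minimization: return the least n >= 0 with f(n) == 0.

    This is a partial function: if no such n exists, the loop runs
    forever and no output is ever produced.
    """
    n = 0
    while f(n) != 0:
        n += 1
    return n

# Halts: the least n with n*n - 9 == 0 is 3.
# mu(lambda n: n * n - 9)  -> 3
# Never halts (no output at all): mu(lambda n: 1)

def naturals():
    """The other sense: deliberately runs forever, emitting
    an infinite amount of output (0, 1, 2, ...)."""
    n = 0
    while True:
        yield n
        n += 1
```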
::::::Um. Agree that it needs to be sourced. But a definition of ''algorithm'' that doesn't allow for I/O is seriously broken.
::::::The problem with referring to an algorithm as "something that runs on a Turing machine" is that it focuses too much attention on the details of TMs, which are beside the point. It's really ''programs'' that run on Turing machines, not algorithms. An algorithm is supposed to be a higher level of abstraction — you can change details of the program, without changing the algorithm it implements. --[[User:Trovatore|Trovatore]] ([[User talk:Trovatore|talk]]) 22:52, 20 March 2010 (UTC)
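::::::(A minimal sketch of the program/algorithm distinction above, with made-up function names: two different ''programs'' — one recursive, one iterative — that both implement the same ''algorithm'', Euclid's algorithm for the greatest common divisor. The program-level details change; the algorithm they implement does not.)

```python
def gcd_recursive(a, b):
    """Euclid's algorithm, written as a recursive program."""
    return a if b == 0 else gcd_recursive(b, a % b)

def gcd_iterative(a, b):
    """The same algorithm, written as an iterative program."""
    while b != 0:
        a, b = b, a % b
    return a
```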