What kind of machine is used to measure the number of steps or amount of memory required to solve a problem? A [[Random access stored program machine|RASP machine]]? I don't think it's a [[Turing machine]]. --[[User:P3d0|P3d0]] ([[User talk:P3d0|talk]]) 05:42, 18 December 2007 (UTC)
:The definitions of major classes are explicitly designed to be robust against "reasonable" changes in the definition of the underlying abstract machine. In typical presentations, various types of Turing machines are used, but it doesn't matter - RAM models, cache-oblivious models, pretty much everything works fine. That's why we discuss P and not, say, DTIME(n<sup>3</sup>). [[User:Dcoetzee|Dcoetzee]] 20:20, 18 December 2007 (UTC)
== Efficiency and Scalability ==
I just changed the introduction back to emphasize scalability as the major component of complexity theory. Many say that this is the same as "efficiency", and the term is coming to mean the same thing in this field, but in lay terms they are still different.
For example, on small lists, Bubble Sort is more efficient than Hash Sort. But to a complexity theorist, Hash Sort is better, and he might call it more "efficient." What he really means is that it is more efficient on larger lists, in other words, more scalable (see the sketch below).
Also, introductions are supposed to relate the topic to the larger world. So it is important to retain the paragraph about real-world implications of the theory. [[User:Scottcraig|Scottcraig]] ([[User talk:Scottcraig|talk]]) 18:54, 14 January 2008 (UTC)
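:A minimal sketch of the scalability point above. Since "Hash Sort" isn't specified, Python's built-in <code>sorted()</code> (Timsort, O(n log n)) stands in for the asymptotically faster algorithm; the exact crossover point depends on the machine and the data, but the quadratic Bubble Sort typically holds its own only on very small lists and falls far behind as n grows.
<syntaxhighlight lang="python">
import random
import timeit

def bubble_sort(items):
    """Quadratic-time sort: cheap per step, so competitive on tiny inputs,
    but the number of comparisons grows as n^2 and it does not scale."""
    a = list(items)
    n = len(a)
    for i in range(n):
        swapped = False
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:  # already sorted; stop early
            break
    return a

# Time both sorts on lists of increasing size.
for n in (8, 100, 10_000):
    data = [random.random() for _ in range(n)]
    t_bubble = timeit.timeit(lambda: bubble_sort(data), number=100)
    t_builtin = timeit.timeit(lambda: sorted(data), number=100)
    print(f"n={n:>6}: bubble={t_bubble:.4f}s  built-in={t_builtin:.4f}s")
</syntaxhighlight>
:On a typical run the gap at n = 8 is negligible (it can even favor the quadratic sort), while at n = 10,000 the built-in sort is orders of magnitude faster, which is the sense in which complexity theorists call the asymptotically better algorithm more "efficient", i.e. more scalable.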