To analyze an algorithm precisely, one must assume it is being executed by a particular [[abstract machine]]. For example, on a [[random access machine]], [[binary search algorithm|binary search]] can be used to rapidly locate a particular value in a sorted list in only O(log ''n'') comparisons, where ''n'' is the number of elements in the list; on a [[Turing machine]], this is not possible, since it can only move one memory cell at a time and so requires Ω(''n'') steps to even reach an arbitrary value in the list.
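The following sketch illustrates the O(log ''n'') bound on a random access machine: each loop iteration performs one three-way comparison and halves the remaining search range, so at most about log ''n'' iterations occur. The comparison counter is added purely for illustration and is not part of the algorithm.
<syntaxhighlight lang="python">
def binary_search(sorted_list, target):
    """Locate target in a sorted list, counting comparisons made."""
    comparisons = 0
    lo, hi = 0, len(sorted_list) - 1
    while lo <= hi:
        mid = (lo + hi) // 2         # constant-time random access to the middle cell
        comparisons += 1
        if sorted_list[mid] == target:
            return mid, comparisons  # found after O(log n) comparisons
        elif sorted_list[mid] < target:
            lo = mid + 1             # discard the lower half
        else:
            hi = mid - 1             # discard the upper half
    return None, comparisons         # absent; still only O(log n) comparisons
</syntaxhighlight>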
Moreover, different abstract machines define different ''primitive'' operations, which are operations that can be performed in constant time. Some machines, like Turing machines, permit only one bit at a time to be read or written; these are called bit operations, and the number of bit operations required to solve a problem is called its '''bit complexity'''.<ref>{{Cite web|title=On the bit complexity of finding points in connected components of a smooth real hypersurface|work=Proceedings of the 45th International Symposium on Symbolic and Algebraic Computation|url=https://dl.acm.org/doi/abs/10.1145/3373207.3404058|access-date=2020-07-29|doi=10.1145/3373207.3404058|language=en}}</ref>
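As an illustration of bit complexity, schoolbook addition of two ''n''-bit numbers takes Θ(''n'') bit operations: one full-adder step per bit position. The sketch below assumes an illustrative representation of numbers as lists of bits, least significant bit first.
<syntaxhighlight lang="python">
def add_bits(a_bits, b_bits):
    """Add two numbers given as bit lists (least significant bit first),
    counting one full-adder step per bit position."""
    n = max(len(a_bits), len(b_bits))
    a = a_bits + [0] * (n - len(a_bits))  # pad both operands to n bits
    b = b_bits + [0] * (n - len(b_bits))
    result, carry, bit_ops = [], 0, 0
    for i in range(n):
        total = a[i] + b[i] + carry  # one full-adder step: O(1) bit operations
        result.append(total & 1)     # sum bit
        carry = total >> 1           # carry bit
        bit_ops += 1
    result.append(carry)
    return result, bit_ops           # bit complexity of addition: Θ(n)
</syntaxhighlight>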
Another commonly used model has words with log ''n'' bits, where ''n'' is a variable depending on the input. For example, in [[graph algorithm]]s, it is typical to assume that the vertices are numbered 1 through ''n'' and that each memory cell can hold any of these values, so that a single cell can refer to any vertex. This is justified in problems where the input uses Ω(''n'') words of storage, since on real computers the memory cell and register size is typically chosen to be large enough to index any word in memory. Operations on these words, such as copies and arithmetic operations, are assumed to take constant time, rather than O(log ''n'') time. The number of word operations required to solve a problem in this model is sometimes called its '''word complexity'''.
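A sketch of this unit-cost accounting, assuming that vertex labels and counters each fit in a single machine word: computing the in-degrees of a graph then takes Θ(''n'' + ''m'') word operations for ''m'' edges, with no extra O(log ''n'') factor for the arithmetic. The operation counter is illustrative only.
<syntaxhighlight lang="python">
def in_degrees(n, edges):
    """In-degrees of a graph with vertices numbered 1 through n.
    Each cell of deg holds one machine word, so each read, write,
    or addition is charged as a single unit-cost word operation."""
    deg = [0] * (n + 1)       # n + 1 word-sized cells
    word_ops = 0
    for (u, v) in edges:      # labels u and v each fit in one word
        deg[v] += 1           # one read, one add, one write
        word_ops += 3
    return deg[1:], word_ops  # word complexity Θ(n + m)
</syntaxhighlight>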
[[Oracle machine]]s are machines that have a specific operation that they can perform in constant time; this operation can be arbitrarily complex and can dramatically affect the complexity of algorithms performed on the machine. For example, if one has an oracle to solve any [[NP-complete]] problem, then any problem in '''[[NP (complexity)|NP]]''' can be solved in polynomial time (whereas without the oracle, no polynomial-time algorithm is known for many of these problems). Oracle machines are impractical to construct but useful in theory for determining which proof techniques will be effective.
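The sketch below illustrates the idea on the [[subset sum problem]]: given a decision oracle for subset sum (simulated here by brute force, but charged as a single constant-time step per call), an actual solution can be recovered with only ''n'' + 1 oracle calls, a standard search-to-decision reduction. The brute-force simulation is illustrative; a real oracle machine simply has each answer available in one step.
<syntaxhighlight lang="python">
from itertools import combinations

def subset_sum_oracle(items, target):
    """Stand-in for an oracle deciding whether some subset of items
    sums to target; on an oracle machine this counts as one step."""
    return any(sum(c) == target
               for r in range(len(items) + 1)
               for c in combinations(items, r))

def find_subset(items, target):
    """Recover an actual subset using n + 1 oracle calls."""
    calls = 1
    if not subset_sum_oracle(items, target):
        return None, calls                   # no solution exists
    chosen, remaining = [], list(items)
    for x in list(items):
        rest = list(remaining)
        rest.remove(x)                       # tentatively discard x
        calls += 1
        if subset_sum_oracle(rest, target):  # still solvable: x is unneeded
            remaining = rest
        else:                                # x occurs in every remaining solution
            chosen.append(x)
            remaining = rest
            target -= x
    return chosen, calls                     # n + 1 oracle calls in total
</syntaxhighlight>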
== Metric being measured ==
Because disk operations are orders of magnitude slower than accesses to main memory, the typical metric used in disk-based algorithms is the number of disk seeks or the number of blocks read from the disk, which depends on both the input and the parameters of the disk; RAM accesses and CPU operations are treated as "free". Similarly, in many models used to study data structures, such as the [[Cache-oblivious algorithm|cache-oblivious model]], operations on cached data are considered free because they are typically orders of magnitude faster in practice than accesses to main memory. Consequently, the typical metric used is the number of cache misses, which depends on both the input and the parameters of the cache.
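A sketch of this accounting for a simple scan, assuming the input resides on disk in blocks of ''B'' items: summing ''n'' numbers then costs ⌈''n''/''B''⌉ block reads, while the per-item CPU work on each fetched block is not charged. The simulated disk and counters are illustrative.
<syntaxhighlight lang="python">
def external_sum(disk, block_size):
    """Sum n numbers held on a simulated disk, charging only block reads."""
    total, block_reads = 0, 0
    for start in range(0, len(disk), block_size):
        block = disk[start:start + block_size]  # one disk seek and block transfer
        block_reads += 1
        total += sum(block)                     # "free" CPU work on cached data
    return total, block_reads                   # ceil(n / B) I/Os for a full scan
</syntaxhighlight>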
==References==
{{Reflist}}
[[Category:Computational complexity theory]]