{{Short description|Amount of resources to perform an algorithm}}
{{CS1 config|mode=cs2}}
In [[computer science]], the '''computational complexity''' or simply '''complexity''' of an [[algorithm]] is the amount of resources required to run it.<ref>{{citation
| last = Vadhan | first = Salil | author-link = Salil Vadhan
| editor1-last = van Tilborg | editor1-first = Henk C. A.
| editor2-last = Jajodia | editor2-first = Sushil
| contribution = Computational Complexity
| contribution-url = https://people.seas.harvard.edu/~salil/research/ComputationalComplexity-2ndEd.pdf
| doi = 10.1007/978-1-4419-5906-5_442
| isbn = 9781441959065
| pages = 235–240
| publisher = Springer
| title = Encyclopedia of Cryptography and Security
}}</ref> Particular focus is given to [[time complexity|computation time]] (generally measured by the number of needed elementary operations) and [[space complexity|memory storage]] requirements. The complexity of a [[computational problem|problem]] is the complexity of the best algorithms that allow solving the problem.
The study of the complexity of explicitly given algorithms is called [[analysis of algorithms]], while the study of the complexity of problems is called [[computational complexity theory]]. The two areas are closely related, as the complexity of an algorithm is always an [[upper bound]] on the complexity of the problem solved by this algorithm. Moreover, for designing efficient algorithms, it is often fundamental to compare the complexity of a specific algorithm to the complexity of the problem to be solved. Also, in most cases, the only thing that is known about the complexity of a problem is that it is at most the complexity of the most efficient known algorithms. Therefore, there is a large overlap between analysis of algorithms and complexity theory.
As the amount of resources required to run an algorithm generally varies with the size of the input, the complexity is typically expressed as a function {{math|''n'' → ''f''(''n'')}}, where {{math|''n''}} is the size of the input and {{math|''f''(''n'')}} is either the [[worst-case complexity]] (the maximum of the amount of resources that are needed over all inputs of size {{math|''n''}}) or the [[average-case complexity]] (the average of the amount of resources over all inputs of size {{math|''n''}}). [[Time complexity]] is generally expressed as the number of required elementary operations on an input of size {{math|''n''}}, where elementary operations are assumed to take a constant amount of time on a given computer and change only by a constant factor when run on a different computer. [[Space complexity]] is generally expressed as the amount of [[computer memory|memory]] required by an algorithm on an input of size {{math|''n''}}.
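For instance (a minimal Python sketch added here for illustration, using the hypothetical helper <code>linear_search</code>, not an algorithm discussed elsewhere in this article), the elementary operations of a [[linear search]] can be counted explicitly: on an input of size {{mvar|n}}, the worst case costs {{mvar|n}} comparisons, while the average over all successful searches costs {{math|(''n'' + 1)/2}}.

<syntaxhighlight lang="python">
def linear_search(data, target):
    """Return (index, comparisons) for a linear scan that counts entry comparisons."""
    comparisons = 0
    for i, value in enumerate(data):
        comparisons += 1          # one elementary comparison per probed entry
        if value == target:
            return i, comparisons
    return -1, comparisons

n = 1000
data = list(range(n))

# Worst case: the target is the last element, so all n comparisons are needed.
_, worst = linear_search(data, n - 1)

# Average case: mean number of comparisons over all n successful searches,
# which is (n + 1) / 2.
average = sum(linear_search(data, t)[1] for t in data) / n

print(worst)    # 1000
print(average)  # 500.5
</syntaxhighlight>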
{{anchor|bit complexity}}
For many algorithms, the size of the integers that are used during a computation is not bounded, and it is not realistic to consider that arithmetic operations take a constant time. Therefore, the time complexity, generally called '''bit complexity''' in this context, may be much larger than the arithmetic complexity. For example, the arithmetic complexity of the computation of the [[determinant]] of an {{math|''n''×''n''}} [[integer matrix]] is <math>O(n^3)</math> for the usual algorithms ([[Gaussian elimination]]). The bit complexity of the same algorithms is [[exponential function|exponential]] in {{mvar|n}}, because the size of the coefficients may grow exponentially during the computation. On the other hand, if these algorithms are coupled with [[modular arithmetic|multi-modular arithmetic]], the bit complexity may be reduced to {{math|[[soft O notation|''O''<sup>~</sup>(''n''<sup>4</sup>)]]}}.
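As a concrete illustration (a toy Python sketch added here, not taken from a cited source), one can watch the coefficient growth in a naive ''division-free'' variant of the elimination, which updates entries by cross-multiplication instead of dividing by the pivot; the bit length of the entries then roughly doubles at every step, even though only <math>O(n^3)</math> arithmetic operations are performed. The fraction-free [[Bareiss algorithm]], which divides by the previous pivot, keeps this growth polynomial. The helper name <code>naive_integer_elimination</code> is illustrative only.

<syntaxhighlight lang="python">
import random

def naive_integer_elimination(a):
    """Division-free elimination on an integer matrix (toy illustration only).

    Each step uses cross-multiplication a[i][j] = a[i][j]*pivot - a[i][k]*a[k][j]
    with no division, so entry bit lengths typically roughly double per step.
    This does not compute the determinant directly; it only tracks growth.
    Returns the maximum entry bit length observed after each elimination step.
    """
    n = len(a)
    a = [row[:] for row in a]
    growth = []
    for k in range(n - 1):
        pivot = a[k][k]
        for i in range(k + 1, n):
            for j in range(k + 1, n):
                a[i][j] = a[i][j] * pivot - a[i][k] * a[k][j]
        growth.append(max(abs(x).bit_length() for row in a for x in row))
    return growth

random.seed(0)
n = 12
matrix = [[random.randint(1, 9) for _ in range(n)] for _ in range(n)]

# The printed bit lengths typically roughly double at each step, so the bit
# complexity is exponential in n despite the O(n^3) count of arithmetic operations.
print(naive_integer_elimination(matrix))
</syntaxhighlight>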
In [[sorting]] and [[search algorithm|searching]], the resource that is generally considered is the number of entry comparisons. This is generally a good measure of the time complexity if data are suitably organized.
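As an illustration (a minimal Python sketch added here, using the hypothetical helper <code>binary_search</code>), counting comparisons directly shows why suitably organized data matters: on a sorted array of {{mvar|n}} entries, a [[binary search algorithm|binary search]] needs at most {{math|⌊log<sub>2</sub> ''n''⌋ + 1}} three-way comparisons, against up to {{mvar|n}} for a linear scan of unsorted data.

<syntaxhighlight lang="python">
import math

def binary_search(sorted_data, target):
    """Return (index, comparisons) for a binary search on sorted data.

    Each loop iteration performs one (three-way) comparison between the
    target and the probed entry, so at most floor(log2 n) + 1 comparisons
    are needed on an array of n entries.
    """
    lo, hi = 0, len(sorted_data) - 1
    comparisons = 0
    while lo <= hi:
        mid = (lo + hi) // 2
        comparisons += 1
        if sorted_data[mid] == target:
            return mid, comparisons
        elif sorted_data[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, comparisons

n = 1_000_000
data = list(range(n))

# A few probes, including a missing key: never more than floor(log2 n) + 1 comparisons.
print(max(binary_search(data, t)[1] for t in (0, n - 1, n // 2, -1)))
print(math.floor(math.log2(n)) + 1)   # 20
</syntaxhighlight>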
==See also==
* [[Computational complexity of mathematical operations]]
* [[Chinese Postman Problem Complexity List]]
* [[Master theorem (analysis of algorithms)]]
==References==
{{Reflist}}
{{Computer science}}
[[Category:Analysis of algorithms]]
[[Category:Computational complexity theory]]