In [[linear algebra]], the '''Coppersmith–Winograd algorithm''', named after [[Don Coppersmith]] and [[Shmuel Winograd]], was the asymptotically fastest known [[algorithm]] for square [[matrix multiplication]] until 2010. It can multiply two <math>n \times n</math> matrices in <math>O(n^{2.375477})</math> time<ref>Don Coppersmith and Shmuel Winograd. Matrix multiplication via arithmetic progressions. ''Journal of Symbolic Computation'', 9(3):251–280, 1990, doi:10.1016/S0747-7171(08)80013-2.</ref> (see [[Big O notation]]).
This is an improvement over the naïve <math>O(n^3)</math> time algorithm and the <math>O(n^{2.807})</math> time [[Strassen algorithm]]. Algorithms with better asymptotic running time than the Strassen algorithm are rarely used in practice, because the constant factors hidden in the big-O notation are so large that the asymptotic advantage only appears for matrices far beyond practical sizes.
It is possible to improve the exponent further; however, the exponent must be at least 2 (because an <math>n \times n</math> matrix has <math>n^2</math> values, and all of them have to be read at least once to calculate the exact result).
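For contrast with the asymptotically faster methods above, the naïve algorithm can be sketched directly: it computes each of the <math>n^2</math> output entries as an <math>n</math>-term inner product, for <math>n^3</math> scalar multiplications in total (a minimal illustrative sketch, not part of the Coppersmith–Winograd method itself):

```python
def matmul_naive(A, B):
    """Naive matrix multiplication of two n-by-n matrices.

    There are n^2 output entries (hence the Omega(n^2) lower bound
    on any exact algorithm), and each entry is an n-term inner
    product, giving n^3 scalar multiplications overall.
    """
    n = len(A)
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]
    return C


# Example: multiply two 2x2 matrices.
print(matmul_naive([[1, 2], [3, 4]], [[5, 6], [7, 8]]))
# → [[19, 22], [43, 50]]
```

Strassen's algorithm and its successors beat this bound by trading some of the <math>n^3</math> multiplications for extra additions via clever recursive block decompositions.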
In 2010, Andrew Stothers gave an improvement to the algorithm, <math>O(n^{2.3736}).</math><ref>{{Citation | last1=Stothers | first1=Andrew | title=On the Complexity of Matrix Multiplication | url=http://www.maths.ed.ac.uk/pg/thesis/stothers.pdf | year=2010}}.</ref> In 2011, Virginia Williams combined a mathematical short-cut from Stothers' paper with her own insights and automated optimization on computers, improving the bound to <math>O(n^{2.3727}).</math><ref>{{Citation | last1=Williams | first1=Virginia | title=Breaking the Coppersmith-Winograd barrier | url=http://www.cs.berkeley.edu/~virgi/matrixmult.pdf | year=2011}}</ref>
* [[Gauss–Jordan elimination]]
* [[Strassen algorithm]]
{{Numerical linear algebra}}