{{Use dmy dates|date=July 2013}}
In [[linear algebra]], the '''Coppersmith–Winograd algorithm''', named after [[Don Coppersmith]] and [[Shmuel Winograd]], was the asymptotically fastest known [[algorithm]] for square [[matrix multiplication]] until 2010. It can multiply two <math>n \times n</math> matrices in <math>O(n^{2.375477})</math> time <ref name="coppersmith">{{Citation|doi=10.1016/S0747-7171(08)80013-2|title=Matrix multiplication via arithmetic progressions|url=http://www.cs.umd.edu/~gasarch/TOPICS/ramsey/matrixmult.pdf|year=1990|last1=Coppersmith|first1=Don|last2=Winograd|first2=Shmuel|journal=Journal of Symbolic Computation|volume=9|issue=3|pages=251}}</ref> (see [[Big O notation]]).
This is an improvement over the naïve <math>O(n^3)</math> time algorithm and the <math>O(n^{2.807})</math> time [[Strassen algorithm]]. Algorithms with better asymptotic running time than the Strassen algorithm are rarely used in practice, because their large constant factors make them slower than Strassen's for matrices of any feasible size.
It is possible to improve the exponent further; however, the exponent must be at least 2 (because an <math>n \times n</math> matrix has <math>n^2</math> values, and all of them have to be read at least once to calculate the exact result).
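To make the exponent comparison concrete, the following is a minimal Python sketch (not part of the article) of Strassen's recursion for matrices whose side length is a power of two: each step replaces the eight half-size products of the naïve method with seven, which yields the <math>O(n^{\log_2 7}) \approx O(n^{2.807})</math> bound. The helper names (`add`, `sub`, `strassen`) are illustrative choices, not an established API.

```python
def add(X, Y):
    # Entrywise sum of two equally sized matrices.
    return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def sub(X, Y):
    # Entrywise difference of two equally sized matrices.
    return [[x - y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def strassen(A, B):
    """Multiply two n x n matrices (n a power of two) using Strassen's
    seven-multiplication recursion instead of the naive eight."""
    n = len(A)
    if n == 1:
        return [[A[0][0] * B[0][0]]]
    h = n // 2
    # Split each operand into four h x h quadrants.
    a11 = [row[:h] for row in A[:h]]; a12 = [row[h:] for row in A[:h]]
    a21 = [row[:h] for row in A[h:]]; a22 = [row[h:] for row in A[h:]]
    b11 = [row[:h] for row in B[:h]]; b12 = [row[h:] for row in B[:h]]
    b21 = [row[:h] for row in B[h:]]; b22 = [row[h:] for row in B[h:]]
    # Seven recursive products (Strassen's M1..M7).
    m1 = strassen(add(a11, a22), add(b11, b22))
    m2 = strassen(add(a21, a22), b11)
    m3 = strassen(a11, sub(b12, b22))
    m4 = strassen(a22, sub(b21, b11))
    m5 = strassen(add(a11, a12), b22)
    m6 = strassen(sub(a21, a11), add(b11, b12))
    m7 = strassen(sub(a12, a22), add(b21, b22))
    # Recombine the products into the quadrants of the result.
    c11 = add(sub(add(m1, m4), m5), m7)
    c12 = add(m3, m5)
    c21 = add(m2, m4)
    c22 = add(sub(add(m1, m3), m2), m6)
    return ([r1 + r2 for r1, r2 in zip(c11, c12)] +
            [r1 + r2 for r1, r2 in zip(c21, c22)])
```

The Coppersmith–Winograd algorithm follows the same high-level idea of trading multiplications for additions, but via a far more intricate tensor construction, which is why its smaller exponent does not translate into a practical speedup.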