{{Use dmy dates|date=July 2013}}
In [[linear algebra]], the '''Coppersmith–Winograd algorithm''', named after [[Don Coppersmith]] and [[Shmuel Winograd]], was the asymptotically fastest known [[algorithm]] for square [[matrix multiplication]] until 2010. It can multiply two <math>n \times n</math> matrices in <math>O(n^{2.375477})</math> time<ref name="coppersmith">{{Citation|last1=Coppersmith|first1=Don|last2=Winograd|first2=Shmuel|title=Matrix multiplication via arithmetic progressions|journal=Journal of Symbolic Computation|volume=9|issue=3|pages=251–280|year=1990|doi=10.1016/S0747-7171(08)80013-2|url=http://www.cs.umd.edu/~gasarch/ramsey/matrixmult.pdf}}</ref> (see [[Big O notation]]).
This is an improvement over the naïve <math>O(n^3)</math> time algorithm and the <math>O(n^{2.807})</math> time [[Strassen algorithm]]. Algorithms with better asymptotic running time than the Strassen algorithm, such as the Coppersmith–Winograd algorithm, are rarely used in practice, because their asymptotic advantage only appears for matrices too large to be handled in practice.
It is possible to improve the exponent further; however, the exponent must be at least 2 (because an <math>n \times n</math> matrix has <math>n^2</math> values, and all of them have to be read at least once to calculate the exact result).
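For comparison, the following is a minimal Python sketch of the two algorithms mentioned above as baselines, the naïve <math>O(n^3)</math> triple loop and Strassen's <math>O(n^{2.807})</math> divide-and-conquer recursion; it is illustrative only and is not the Coppersmith–Winograd algorithm itself. The function names are hypothetical, and the Strassen version assumes the matrix dimension is a power of two.

<syntaxhighlight lang="python">
import numpy as np

def naive_multiply(A, B):
    """Schoolbook multiplication: each of the n^2 output entries
    needs n multiplications, giving O(n^3) work overall."""
    n = A.shape[0]
    C = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i, j] += A[i, k] * B[k, j]
    return C

def strassen_multiply(A, B):
    """Strassen's recursion: 7 products of half-size blocks instead of 8,
    giving O(n^log2(7)) ~ O(n^2.807). Assumes n is a power of two."""
    n = A.shape[0]
    if n == 1:
        return A * B
    h = n // 2
    A11, A12, A21, A22 = A[:h, :h], A[:h, h:], A[h:, :h], A[h:, h:]
    B11, B12, B21, B22 = B[:h, :h], B[:h, h:], B[h:, :h], B[h:, h:]
    # The seven Strassen products.
    M1 = strassen_multiply(A11 + A22, B11 + B22)
    M2 = strassen_multiply(A21 + A22, B11)
    M3 = strassen_multiply(A11, B12 - B22)
    M4 = strassen_multiply(A22, B21 - B11)
    M5 = strassen_multiply(A11 + A12, B22)
    M6 = strassen_multiply(A21 - A11, B11 + B12)
    M7 = strassen_multiply(A12 - A22, B21 + B22)
    # Recombine the blocks of the result.
    C11 = M1 + M4 - M5 + M7
    C12 = M3 + M5
    C21 = M2 + M4
    C22 = M1 - M2 + M3 + M6
    return np.block([[C11, C12], [C21, C22]])
</syntaxhighlight>

The Coppersmith–Winograd algorithm improves the exponent further by a substantially more involved construction based on arithmetic progressions, rather than by a fixed block recursion of this kind.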