Computational complexity of matrix multiplication

There is a trivial lower bound of {{tmath|\omega \ge 2}}: since any algorithm for multiplying two {{math|''n'' × ''n''}} matrices must process all {{math|2''n''<sup>2</sup>}} entries, every matrix multiplication algorithm requires {{math|Ω(''n''<sup>2</sup>)}} operations. Thus {{tmath|2\le \omega < 2.37188}}. It is unknown whether {{tmath|\omega > 2}}. The best known lower bound on matrix multiplication complexity is {{math|Ω(''n''<sup>2</sup> log(''n''))}}, for bounded-coefficient [[Arithmetic circuit complexity|arithmetic circuits]] over the real or complex numbers, and is due to [[Ran Raz]].<ref>{{cite book | last1 = Raz | first1 = Ran | title = Proceedings of the thirty-fourth annual ACM symposium on Theory of computing | chapter = On the complexity of matrix product | author-link = Ran Raz | year = 2002 | pages = 144–151 | doi = 10.1145/509907.509932 | isbn = 1581134959 | s2cid = 9582328 }}</ref>
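The gap between this trivial {{math|Ω(''n''<sup>2</sup>)}} lower bound and the {{math|Θ(''n''<sup>3</sup>)}} cost of the schoolbook algorithm can be made concrete with a short sketch; the code below is an illustrative example only and is not drawn from the cited sources.

<syntaxhighlight lang="python">
# Illustrative sketch (not from the article's sources): the schoolbook method
# performs n**3 scalar multiplications, while the input alone already has
# 2 * n**2 entries, which is the source of the trivial Omega(n^2) lower bound.

def naive_matmul(A, B):
    """Schoolbook matrix multiplication: Theta(n^3) scalar multiplications."""
    n = len(A)
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]
    return C

n = 4
A = [[1 if i == j else 0 for j in range(n)] for i in range(n)]  # identity matrix
B = [[i * n + j for j in range(n)] for i in range(n)]
assert naive_matmul(A, B) == B            # identity times B returns B
print("input entries read:", 2 * n * n)   # trivial Omega(n^2) lower bound
print("scalar multiplications used:", n ** 3)
</syntaxhighlight>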
 
The exponent ω is defined as a [[Accumulation point|limit point]]: it is the infimum of the exponents achievable by matrix multiplication algorithms. This infimum is known not to be attained. In other words, under the model of computation typically studied, there is no matrix multiplication algorithm that uses precisely {{math|O(''n''<sup>ω</sup>)}} operations; there must be an additional factor of {{math|''n''<sup>o(1)</sup>}}.<ref name="CW.81"/>
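Formally, this definition of ω can be stated as follows (a standard restatement, not quoted from the cited reference):
<math display="block">\omega = \inf\bigl\{\tau \in \mathbb{R} : \text{two } n \times n \text{ matrices can be multiplied using } O(n^{\tau}) \text{ arithmetic operations}\bigr\}.</math>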
 
=== Rectangular matrix multiplication ===