Talk:Shor's algorithm

:Any reference to these concerns? Also, how many qubits does it take to factor 21, and what is the maximum number of qubits that has been implemented in a quantum computer? [[User:C-randles|crandles]] 15:16, 21 September 2006 (UTC)
 
:It takes 2n qubits to factor an n-bit number. 21 requires 5 bits to represent, and would therefore require a quantum computer with at least 10 qubits to factor. [[User:198.37.27.79|198.37.27.79]] 03:04, 27 October 2006 (UTC)
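
:A minimal sketch of that arithmetic, assuming the 2''n''-qubit rule of thumb stated above (actual qubit-count estimates for Shor's algorithm vary by construction):
<syntaxhighlight lang="python">
# Sanity check of the qubit-count estimate above, under the
# "2n qubits to factor an n-bit number" rule of thumb.
def qubits_needed(N: int) -> int:
    n = N.bit_length()   # bits needed to represent N (21 -> 5)
    return 2 * n         # register size under the 2n assumption

print(qubits_needed(21))  # 10 qubits
print(qubits_needed(15))  # 8 qubits
</syntaxhighlight>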