Processor design

As late as the 1970s, major computer languages such as [[C_language|C]] were unable to standardize their numeric behavior because decimal computers still had groups of users too large to alienate.
 
Even when designers used a binary system, they still had many odd ideas. Some used sign-magnitude arithmetic (in a five-bit word, -1 = 10001) rather than modern [[two's complement]] arithmetic (-1 = 11111). Most computers used six-bit character sets, because they adequately encoded [[Hollerith]] cards. It was a major revelation to designers of this period that the data word should be a multiple of the character size. They began to design computers with 12-, 24- and 36-bit data words (e.g., the [[TX-2]]).
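As an illustration (not part of the article's sources), the following C sketch encodes -1 in a five-bit word under both conventions; the five-bit width and the helper names are invented for this example.

<syntaxhighlight lang="c">
#include <stdio.h>

/* Hypothetical helpers for illustration: encode a small integer in a
   five-bit word under the two conventions discussed above. */

/* Sign-magnitude: the top bit is the sign, the low four bits hold |v|. */
unsigned sign_magnitude5(int v) {
    unsigned sign = (v < 0) ? 1u : 0u;
    unsigned mag  = (unsigned)((v < 0) ? -v : v) & 0xFu;
    return (sign << 4) | mag;
}

/* Two's complement: reduce modulo 2^5, so -1 wraps around to 11111. */
unsigned twos_complement5(int v) {
    return (unsigned)v & 0x1Fu;
}

/* Print the low five bits, most significant first. */
void print5(const char *label, unsigned bits) {
    printf("%s: ", label);
    for (int i = 4; i >= 0; i--)
        putchar(((bits >> i) & 1u) ? '1' : '0');
    putchar('\n');
}

int main(void) {
    print5("sign-magnitude   -1", sign_magnitude5(-1));  /* prints 10001 */
    print5("two's complement -1", twos_complement5(-1)); /* prints 11111 */
    return 0;
}
</syntaxhighlight>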
 
In this era, [[Grosch's law]] dominated computer design: computing power increased as the square of its cost, so cost rose only as the square root of speed and faster machines were disproportionately economical.
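A hedged numeric sketch of that scaling (the constant k and the cost values are invented for illustration): modeling Grosch's law as performance = k * cost^2, one machine bought at twice the price outperforms two machines bought at the original price.

<syntaxhighlight lang="c">
#include <stdio.h>

/* Illustrative only: Grosch's law modeled as performance = k * cost^2.
   The constant k and the costs are made up for this example. */
int main(void) {
    double k = 1.0;
    double c = 1.0;
    double one_big   = k * (2.0 * c) * (2.0 * c);  /* one machine at cost 2c -> 4k */
    double two_small = 2.0 * (k * c * c);          /* two machines at cost c -> 2k */
    printf("one machine at 2c: %.1f units of performance\n", one_big);
    printf("two machines at c: %.1f units of performance\n", two_small);
    return 0;
}
</syntaxhighlight>

Under this model, pooling a budget into one large computer buys twice the throughput of splitting it across two small ones, which is why the era favored ever-larger machines.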