===RISC enters===
While companies continued to compete on the complexity of their instruction sets, and the use of microcode to implement them went unquestioned, an internal project at IBM in the mid-1970s raised serious questions about the entire concept. As part of a project to develop a high-performance all-digital [[telephone switch]], a team led by [[John Cocke (computer scientist)|John Cocke]] began examining huge volumes of performance data from their customers' 360 (and [[System/370]]) programs. They noticed a curious pattern: when the ISA offered multiple versions of an instruction, the [[compiler]] almost always used the simplest one, rather than the one that most directly represented the code. They learned that this was because those simple instructions were always implemented in hardware and thus ran the fastest; using the more complex instructions might offer higher performance on some machines, but there was no way for the compiler to know which machine the program would run on.
<ref>{{cite journal | last1 = Cocke | first1 = John | last2 = Markstein | first2 = Victoria | title = The evolution of RISC technology at IBM | journal = IBM Journal of Research and Development | date = January 1990 | volume = 34 | issue = 1 | pages = 4–11}}</ref>