Commodity computing

During the 1980s, microcomputers began displacing "real" computers in a serious way. At first, price was the key justification, but by the mid-1980s semiconductor technology had evolved to the point where microprocessor performance began to eclipse that of discrete-logic designs. These traditional designs were limited by speed-of-light delay issues inherent in any CPU larger than a single chip, and performance alone began driving the success of microprocessor-based systems.
 
The old processor architectures began to fall: first minis, then [[supermini]]s, and finally [[Mainframe computer|mainframes]]. By the mid-1990s, nearly every computer made was based on a microprocessor, and the majority of general-purpose microprocessors were implementations of the x86 ISA. Although there was a time when every traditional computer manufacturer had its own proprietary microprocessor-based designs, there are only a few manufacturers of non-commodity computer systems today. However, super microcomputers (large-scale computer systems based on one or more microprocessors, such as the IBM p, i, and z series) still own the high end of the market.
 
== Commodity computing in the present day ==