'''Commodity computing''' is computing done on commodity computers as opposed to supermicrocomputers or boutique computers. Commodity computers are [[computer system]]s manufactured by multiple vendors, incorporating components based on [[open standard]]s. Such systems are said to be based on [[commodity]] components, since the standardization process promotes lower costs and less differentiation among vendors' products.
 
== History ==
=== The Mid-1960s to Early 1980s ===
The first computers were large, expensive, complex and proprietary. The move towards commodity computing began when [[Digital Equipment Corporation|DEC]] introduced the [[PDP-8]] in 1965. This was a computer that was relatively small and inexpensive enough that a department could purchase one without convening a meeting of the board of directors. The entire [[minicomputer]] industry sprang up to supply the demand for 'small' computers like the PDP-8. Unfortunately, each of the many different brands of minicomputers had to stand on their own because there was no software and very little hardware compatibility between them.
 
This process accelerated in 1977 with the introduction of the first commodity-like microcomputer, the [[Apple II]]. With the development of the [[VisiCalc]] application in 1979, microcomputers broke out of the factory and began entering office suites in large quantities, but still through the back door.
 
=== The 1980s to Mid-1990s ===
The [[IBM PC]] was introduced in 1981 and immediately began displacing Apple IIs in the corporate world, but commodity computing as we know it today truly began when [[Compaq]] developed the first true IBM PC compatible. More and more PC-compatible microcomputers began coming into big companies through the front door, and commodity computing was well established.