History of computing hardware

The '''history of computing hardware''' covers the developments from early simple devices to aid [[calculation]] to modern-day [[computer]]s.
 
The first aids to computation were purely mechanical devices which required the operator to set up the initial values of an elementary [[arithmetic]] operation, then manipulate the device to obtain the result. Later, computers represented numbers in a continuous form (e.g. distance along a scale, rotation of a shaft, or a [[voltage]]). Numbers could also be represented in the form of digits, automatically manipulated by a mechanism. Although this approach generally required more complex mechanisms, it greatly increased the precision of results. The development of [[transistor]] technology and then the [[integrated circuit]] chip led to a series of breakthroughs, starting with transistor computers and then integrated circuit computers, causing digital computers to largely replace [[analog computer]]s. [[MOSFET|Metal-oxide-semiconductor]] (MOS) [[large-scale integration]] (LSI) then enabled [[semiconductor memory]] and the [[microprocessor]], leading to another key breakthrough, the miniaturized [[personal computer]] (PC), in the 1970s. The cost of computers gradually became so low that personal computers by the 1990s, and then [[mobile computing|mobile computers]] ([[smartphone]]s and [[tablet computer|tablets]]) in the 2000s, became ubiquitous.
 
==Early devices==