{{History of computing}}
The '''history of computing hardware''' spans the developments from early devices used for simple [[calculation]]s to today's complex [[computer]]s, encompassing advancements in both analog and digital technology.
The first aids to computation were purely mechanical devices which required the operator to set up the initial values of an elementary [[arithmetic]] operation, then manipulate the device to obtain the result. In later stages, computing devices began representing numbers in continuous forms, such as by distance along a scale, rotation of a shaft, or a specific [[voltage]] level. Numbers could also be represented in the form of digits, automatically manipulated by a mechanism. Although this approach generally required more complex mechanisms, it greatly increased the precision of results.

The development of [[transistor]] technology, followed by the invention of [[integrated circuit]] chips, led to a series of revolutionary breakthroughs. Transistor-based computers and, later, integrated circuit-based computers enabled digital systems to gradually replace [[analog computer]]s, increasing both efficiency and processing power. [[MOSFET|Metal-oxide-semiconductor]] (MOS) [[large-scale integration]] (LSI) then enabled [[semiconductor memory]] and the [[microprocessor]], leading to another key breakthrough, the miniaturized [[personal computer]] (PC), in the 1970s. The cost of computers gradually became so low that personal computers by the 1990s, and then [[mobile computing|mobile computers]] ([[smartphone]]s and [[tablet computer|tablets]]) in the 2000s, became ubiquitous.