'''Electronic data processing''' (also: [[Information Technology]] or IT) can refer to the use of automated methods to process commercial data. Typically, this involves relatively simple, repetitive activities applied to large volumes of similar information. For example: stock updates applied to an inventory, banking transactions applied to account and customer master files, booking and ticketing transactions applied to an airline's reservation system, and billing for utility services.
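This kind of workload can be illustrated with a short sketch. The following is a minimal Python example of batch-style processing; the inventory master file and the update records are hypothetical, and real systems of the era applied such transactions to master files held on tape or disk rather than to in-memory structures.

<syntaxhighlight lang="python">
# Minimal sketch of batch-style stock processing (hypothetical data):
# a master inventory is updated by applying a large batch of simple,
# similar transactions -- the typical EDP workload.

inventory = {"A100": 50, "B205": 120, "C310": 75}  # part number -> quantity on hand

# Each transaction adjusts one part's stock (negative = issue, positive = receipt).
transactions = [("A100", -5), ("B205", +30), ("A100", -10), ("C310", -25)]

for part, change in transactions:
    if part in inventory:
        inventory[part] += change
    else:
        print(f"Reject: unknown part {part}")  # set aside for manual correction

print(inventory)  # {'A100': 35, 'B205': 150, 'C310': 50}
</syntaxhighlight>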
 
== History ==
 
The first commercial business computer was developed in the [[United Kingdom]] in [[1951]] by the [[Joe Lyons]] catering organisation. This was known as the '[[LEO computer|Lyons Electronic Office]]', or LEO for short. It was developed further and used widely during the 1960s and early 1970s. (Joe Lyons formed a separate company to develop the LEO computers; this subsequently merged to form English Electric Leo Marconi and then International Computers Ltd.)
 
Early commercial systems were installed exclusively by large organisations. These could afford to invest the time and capital necessary to purchase hardware, hire specialist staff to develop bespoke [[software]] and work through the consequent (and often unexpected) organisational and cultural changes.
 
At first, individual organisations developed their own software, including data management utilities. Different products might also have 'one-off' bespoke software. This fragmented approach led to duplicated effort, and producing management information required manual work.
 
High hardware costs and relatively slow processing speeds forced developers to use resources 'efficiently'. [[Computer storage|Data storage]] formats were heavily compacted, for example: a common practice was the removal of the century from dates, which eventually led to the '[[millennium bug]]'.
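The consequence of dropping the century can be shown in a few lines. This is an illustrative sketch rather than code from any historical system: once only the last two digits of the year are stored, dates on either side of the year 2000 compare and subtract incorrectly.

<syntaxhighlight lang="python">
# Illustrative sketch of the 'millennium bug': storing years as two
# digits saves space but breaks cross-century arithmetic.

year_opened = 99   # intended: 1999, stored without the century
year_now = 0       # intended: 2000

# An account opened in 1999 appears to have been opened after 2000.
if year_opened > year_now:
    print("Account opened in the future?")  # the Y2K failure mode

print(year_now - year_opened)  # -99 instead of an age of 1 year
</syntaxhighlight>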
 
Data input required intermediate processing via punched paper tape or [[punch card|card]] and separate input to computers, usually for overnight processing. Data had to be validated in batches. All of this was a repetitive, labour-intensive task, removed from user control and error-prone. Invalid or incorrect data needed correction and resubmission, with consequences for data and account reconciliation.
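The batch validation step can be sketched as follows. The record layout and validation rules here are hypothetical, chosen only to show how a batch run separated accepted records from rejects, which then had to be corrected and resubmitted in a later run.

<syntaxhighlight lang="python">
# Hypothetical sketch of overnight batch validation: records arrive as
# fixed-format text (as if read from punched cards), are checked as a
# batch, and rejects are set aside for correction and resubmission.

batch = ["A100 0005", "B205 00X0", "ZZZZ 0010"]  # "part quantity" records
known_parts = {"A100", "B205", "C310"}

accepted, rejected = [], []
for record in batch:
    part, qty = record.split()
    if part in known_parts and qty.isdigit():
        accepted.append((part, int(qty)))
    else:
        rejected.append(record)  # corrected and resubmitted in a later run

print(accepted)  # [('A100', 5)]
print(rejected)  # ['B205 00X0', 'ZZZZ 0010']
</syntaxhighlight>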