The first computers
Originally, the word computer was used for a person who did manual calculations (or computations). From the early 1900s the word started to be used for calculating machines as well. The first computing machines were mechanical calculators, but computers as we know them have two defining properties: they calculate, and they are programmable. Programmable computers only became practical after the invention of punched cards, which allowed machines to process batches of data.
The British Colossus computer, created during World War II, was the world's first programmable electronic computer. Its status was never recognized publicly, however, because information about it was classified under British secrecy laws.
The first publicly recognized general-purpose computer was the ENIAC (Electronic Numerical Integrator And Computer). The ENIAC was designed in 1943 and was financed by the United States Army in the midst of World War II. The machine was finished and in full operation in 1946 (after the war) and remained in continuous operation until 1955. While the purpose of the ENIAC was to calculate artillery firing tables for the United States Army's Ballistic Research Laboratory, it was actually first used to perform calculations for the design of the hydrogen bomb.
The ENIAC could perform 5,000 operations per second, which was spectacular at the time. However, it used more than 17,000 vacuum tubes, each with a limited life span, which made it highly unreliable. The ENIAC received its input via an IBM punched-card reader, and punched cards were used for output as well.
Following the invention of the transistor in 1947, computers in the 1960s started to be built using transistors instead of vacuum tubes. Transistor-based machines were smaller, faster, cheaper to produce, required less power, and were much more reliable.
The transistor-based computers were followed in the 1970s by computers based on integrated circuit (IC) technology. ICs are small chips containing a set of transistors that provide standardized building blocks such as AND gates, OR gates, counters, adders, and flip-flops. By combining these building blocks, CPUs and memory circuits could be created.
The subsequent creation of microprocessors decreased the size and cost of computers even further, and increased their speed and reliability. By the 1980s, microprocessors were cheap enough to be used in home and personal computers.
This entry was posted on Saturday 11 September 2010