History and Evolution of Computers



How Did Computers Evolve?

Computers have come a long way in the decades since they were first invented. In their earliest forms, computers were large and slow; today they are small and fast enough to be used almost anywhere.

A computer, simply put, is something that computes. The need for machines that can perform larger calculations and store more information than a simple tally allows has existed since the earliest days of human civilization.

The earliest known computing device was the abacus, a simple tool consisting of parallel rods on which beads slide. It was used mainly for addition and subtraction. From that small calculating device to modern-day supercomputers, computer technology has changed enormously.

Now, notebook-sized portable computers that can be used anywhere represent the pinnacle of modern technology. So let us go back in time and see how computers evolved through the ages.

Some of the first developments took place in Europe in the 17th century. In 1642, the French scientist Blaise Pascal invented a mechanical machine that could add large numbers efficiently, establishing him as one of the earliest inventors of the mechanical calculator.

Inspired by this adding machine, the German mathematician Gottfried Wilhelm Leibniz designed an improved model that, in addition to adding, could perform multiplication, division, and even the extraction of roots for smaller numbers.

Finally, in 1820, Charles Xavier Thomas produced the first commercially successful mechanical calculator. It was Charles Babbage, working in Cambridge, England, who developed the first modern equivalent of a computer system: in 1822 he began building an automatic mechanical calculator known as the “Difference Engine”.


Mechanical Calculator


When extended to be programmable and suitable for general-purpose work, this design became Babbage's “Analytical Engine”. A mechanical computer consists of mechanical elements such as levers and gears, unlike today’s electronic components.

For example, in the Difference Engine designed by Charles Babbage, one complete rotation of the main shaft performed one full set of additions. The engine consisted of several columns, numbered 1 to N.


The Difference Engine


Each column stores exactly one decimal number. On each cycle, the machine adds the value of column n+1 to column n to produce the new value of column n. Thus, with a system of gears, rotating shafts, and sweep arms, calculations were performed on the values stored in the columns.
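The column-addition cycle described above implements the method of finite differences: if the columns are loaded with a polynomial's value and its successive differences, each rotation produces the next value of the polynomial using nothing but addition. Here is a minimal sketch of that idea in Python (a hypothetical simulation for illustration, not a faithful model of Babbage's hardware):

```python
# Simulate one "shaft rotation" of a difference engine.
# columns[0] holds the current tabulated value; columns[n] holds
# the n-th finite difference of the polynomial being tabulated.
def difference_engine_step(columns):
    """Add column n+1 into column n for every column; addition only."""
    for n in range(len(columns) - 1):
        columns[n] += columns[n + 1]
    return columns[0]  # the next tabulated value

# Tabulate f(x) = x^2 starting at x = 0.
# Initial columns: [f(0), first difference, second difference] = [0, 1, 2]
columns = [0, 1, 2]
values = [columns[0]]
for _ in range(5):
    values.append(difference_engine_step(columns))

print(values)  # [0, 1, 4, 9, 16, 25] — the squares 0..5
```

Because the second difference of a quadratic is constant, no multiplication is ever needed; this is precisely why a machine built only from adding gears could tabulate polynomials.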

These machines were as big as a large table. A major step in the evolution of computers came when Herman Hollerith and James Powers, working for the US Census Bureau, developed machines based on punched cards.


Punch card


A punched card is a piece of stiff paper that holds digital data represented by the presence or absence of holes in predefined positions. With punched cards, program instructions and data could be stored and fed to these machines, making their mechanical operations fully automatic.

In 1896, Hollerith formed the Tabulating Machine Company, which manufactured machines based on punched cards. Thomas J. Watson later became president of the successor company, which was renamed the International Business Machines (IBM) Corporation in 1924.

This company played a major role in the evolution of computer systems. In 1944, IBM produced the Automatic Sequence Controlled Calculator (ASCC), better known as the Harvard Mark I, one of the earliest large-scale automatic information-processing machines.


The IBM ASCC / Harvard Mark I


It contained some 765,000 components and 500 miles of wire, and its panel was 51 ft long and 8 ft high. Input data were entered on punched cards, and output was recorded by an electric typewriter. It was among the first “electro-mechanical analytical machines”.

The first all-digital electronic computer was produced during World War II. The heavy calculations required for ballistics tables and, later, nuclear-weapons research demanded a far more efficient computing machine. In 1946, a new device, the Electronic Numerical Integrator and Computer (ENIAC), was completed.

Read More: The Manhattan Project and the Invention of the Atomic Bomb

It was a fully digital computer that used vacuum tubes, and it was about 1,000 times faster than electromechanical computers. One ENIAC could replace 2,400 human computers: it calculated a trajectory in 30 seconds that would take a person 20 hours.

ENIAC was succeeded by EDVAC, the Electronic Discrete Variable Automatic Computer, which also had internal storage and could hold a program in memory, allowing the computer to run automatically (the stored-program concept).

UNIVAC I, the Universal Automatic Computer, became the most popular digital machine of its time. It was produced in 1951 by the Eckert-Mauchly Computer Corporation (by then part of Remington Rand). It also famously predicted the 1952 US presidential election, correctly forecasting Dwight Eisenhower’s victory over Adlai Stevenson.

This machine ushered in the generations of computers to come. The first generation of computers spans roughly 1950 to 1959. These computers used vacuum tubes for logic and ring-shaped ferrite cores for memory storage.

These computers were bulky and very expensive. ENIAC, EDVAC, the UNIVAC I and II, the IBM 702, and the IBM 650 were some famous computers of the first generation.

The second-generation computers, from 1959 to 1969, used semiconductor (transistor) logic elements and were faster and more reliable than the first generation. They were also smaller and less expensive, and they used magnetic tape for storage. The UNIVAC III, Honeywell 400, and Honeywell 800 were some popular additions in the second generation.

The third generation, from 1969 to 1977, began with the use of integrated circuits (ICs). The IC had been invented in 1958, and its small size and ability to perform complex operations soon made it attractive for computers; the first microprocessors also appeared toward the end of this period.

Hundreds of components could now be placed on a tiny silicon chip, and the size of computers was drastically reduced. The IBM 360 and 370, the UNIVAC 1108, and the Honeywell 200 series were important members of this generation. With the arrival of Very Large Scale Integration (VLSI), a process that combines hundreds of thousands or even millions of transistors on a single chip, the fourth generation of computers began in the 1980s.

Circuit density increased greatly, enabling computers to perform many operations at high speed, and semiconductors replaced cores in the memory units. Prices dropped and sizes shrank, which led to the introduction of personal computers in schools, colleges, and government offices.

IBM and Apple Computer played significant roles in this new revolution. The introduction of the Macintosh transformed the market: it was the first commercially successful desktop personal computer with a graphical user interface, a built-in screen, and a mouse. Production of portable PCs also began in the 1980s.

The Osborne 1, released in 1981, was a “luggable” computer weighing about 11 kilograms, and Hewlett-Packard (HP) also began producing portables during that period. The first laptops, with the flip (clamshell) form factor, appeared in the 1980s; displays reached 640×480 resolution in 1988, and by 1991 color screens were available on portable computers. Thus, step by step, the modern laptop came into being.

The fifth generation of computers is now underway, incorporating a wide range of new technologies such as artificial intelligence, robotics, and simulation. In this article, we covered a brief history of the evolution of computers; in the next, we will discuss the future of computers and the generations to come. So what are your thoughts about the history of computers? Let us know in the comments, and if you found this article interesting, please share it with your friends.


