THE FIVE GENERATIONS OF
COMPUTERS
The computer as we know it today had its beginning with a 19th-century English mathematics professor named Charles Babbage. He designed the Analytical Engine, and it is on this design that the basic framework of today's computers is based. Computer history has kept developing ever since, and computers have grown more capable with each advance. The five generations of computers discussed below show that development from one generation to the next. Each generation lasted for a certain period of time, and each gave us either a new kind of computer or an improvement on the existing one.
First Generation (1940-1956) Vacuum Tubes
The first computers used vacuum tubes for circuitry and magnetic drums for memory,
and were often enormous, taking up entire rooms. They were very
expensive to operate and, in addition to using a great deal of
electricity, generated a lot of heat, which was often the cause of
malfunctions.
First generation computers relied on machine language,
the lowest-level programming language understood by computers, to
perform operations, and they could only solve one problem at a time.
Input was based on punched cards and paper tape, and output was
displayed on printouts.
Because they drew so much electricity and gave off so much heat, first-generation computers were difficult and costly to keep running; when the heat was not managed, the system malfunctioned.
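To make "machine language" concrete, here is a minimal sketch of the idea: a hypothetical accumulator machine whose program is nothing but a list of numbers, interpreted in Python. The instruction set is invented for illustration and is not the machine language of any real first-generation computer.

    # A toy "machine language" interpreter. The opcodes are hypothetical:
    # 1 = LOAD value, 2 = ADD value, 3 = SUB value, 0 = HALT.
    def run(program):
        acc = 0   # the machine's single accumulator register
        pc = 0    # program counter: where in the program we are executing
        while True:
            opcode = program[pc]
            if opcode == 0:            # HALT: stop and report the result
                return acc
            operand = program[pc + 1]
            if opcode == 1:            # LOAD: put a value in the accumulator
                acc = operand
            elif opcode == 2:          # ADD to the accumulator
                acc += operand
            elif opcode == 3:          # SUBTRACT from the accumulator
                acc -= operand
            pc += 2                    # step past the opcode and its operand

    # The program is just numbers: LOAD 7, ADD 5, SUB 2, HALT.
    print(run([1, 7, 2, 5, 3, 2, 0]))  # prints 10

Like a first-generation machine, this interpreter solves exactly one problem per run and understands nothing higher-level than its numeric opcodes.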
HOW IT WORKS
The vacuum tube resembled an incandescent light bulb with an additional plate inside. When the filament was heated, electrons were emitted from its surface into the vacuum inside the bulb and moved toward the plate enveloping the filament. Because the filament (the cathode) is hot and the plate (the anode) is cold, electrons flow in only one direction, from cathode to anode, and this one-way movement is what made the tube useful as an electronic component.
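As a rough illustration of that one-way behavior, here is an idealized model in Python (a deliberately simplified sketch, not real tube physics): current flows only when the anode sits at a higher potential than the hot cathode.

    # Idealized vacuum-tube diode: electrons flow from the hot cathode to
    # the anode only when the anode is more positive. The linear
    # "conductance" is an assumption made for simplicity.
    def diode_current(anode_volts, cathode_volts=0.0, conductance=1.0):
        forward_bias = anode_volts - cathode_volts
        return conductance * forward_bias if forward_bias > 0 else 0.0

    for volts in (-5.0, 0.0, 5.0):
        print(volts, "V ->", diode_current(volts), "A")  # conducts only at +5 V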
Second Generation (1956-1963) Transistors
The transistor was far superior to the vacuum tube, allowing computers
to become smaller, faster, cheaper, more energy-efficient and more
reliable than their first-generation predecessors. Though the transistor
still generated a great deal of heat that subjected the computer to
damage, it was a vast improvement over the vacuum tube.
Second-generation computers still relied on punched cards for input and
printouts for output.
HOW IT WORKS
If cells are the building blocks of life, transistors are the
building blocks of the digital revolution. Without transistors, the
technological wonders you use every day -- cell phones, computers, cars -- would be vastly different, if they existed at all. Before transistors, product engineers used vacuum tubes and electromechanical switches to complete electrical circuits.
Tubes were far from ideal. They had to warm up before they worked (and
sometimes overheated when they did), they were unreliable and bulky and
they used too much energy. Everything from televisions, to telephone
systems, to early computers used these components, but in the years
after World War II, scientists were looking for alternatives to vacuum
tubes. They'd soon find their answer from work done decades earlier.
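To see why transistors are called building blocks, consider the following sketch, which treats each transistor as an ideal on/off switch (a simplification of the real, analog device) and composes switches into a NAND gate, from which any digital logic can be built.

    # A transistor modeled as an ideal switch: it conducts exactly when
    # its control input (the base or gate) is on.
    def transistor(control: bool) -> bool:
        return control

    # Two transistors in series pulling the output low form a NAND gate:
    # the output is low only when BOTH transistors conduct.
    def nand(a: bool, b: bool) -> bool:
        return not (transistor(a) and transistor(b))

    # NAND is universal: every other gate can be built from it.
    def inverter(a: bool) -> bool:
        return nand(a, a)

    def and_gate(a: bool, b: bool) -> bool:
        return inverter(nand(a, b))

    for a in (False, True):
        for b in (False, True):
            print(a, b, "-> NAND:", nand(a, b), " AND:", and_gate(a, b))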
Third Generation (1964-1971) Integrated Circuits
The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on chips of silicon, a semiconductor material, which drastically increased the speed and efficiency of computers.
Instead of punched cards and printouts, users interacted with third generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications
at one time with a central program that monitored the memory. Computers
for the first time became accessible to a mass audience because they
were smaller and cheaper than their predecessors.
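The phrase "many different applications at one time with a central program" describes time-sharing. Here is a minimal sketch of the idea in Python, with generators standing in for running programs and a simple round-robin loop standing in for the operating system's scheduler (an illustration of the concept, not how any particular third-generation OS worked).

    from collections import deque

    # A "program" that does a little work, then yields control back to
    # the operating system.
    def job(name, steps):
        for i in range(1, steps + 1):
            print(f"{name}: step {i}")
            yield

    # The "central program": give each ready job one time slice in turn.
    def scheduler(jobs):
        ready = deque(jobs)
        while ready:
            current = ready.popleft()
            try:
                next(current)           # run the job for one slice
                ready.append(current)   # not finished: back of the line
            except StopIteration:
                pass                    # job finished; drop it

    scheduler([job("editor", 2), job("payroll", 3), job("compiler", 2)])

The output interleaves the three jobs, which is exactly the effect a user of a time-sharing system sees: several applications apparently running at once on one machine.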
HOW IT WORKS
The discovery of semiconductors,
the invention of transistors and the creation of the integrated circuit
are what make Moore's Law -- and by extension modern electronics --
possible. Before the invention of the transistor, the most widely-used
element in electronics was the vacuum tube. Electrical
engineers used vacuum tubes to amplify electrical signals. But vacuum
tubes had a tendency to break down and they generated a lot of heat,
too.
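As a back-of-the-envelope illustration of Moore's Law, take its common statement that transistor counts double roughly every two years, starting from the Intel 4004's approximately 2,300 transistors in 1971 (the doubling period is an approximation, and the model is a rule of thumb, not a physical law):

    # Rough Moore's Law projection: transistor count doubles about every
    # two years, starting from the Intel 4004 (~2,300 transistors, 1971).
    def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
        return base_count * 2 ** ((year - base_year) / doubling_years)

    for year in (1971, 1981, 1991, 2001, 2011, 2021):
        print(year, f"~{transistors(year):,.0f} transistors")

Fifty years of doubling every two years multiplies the count by about 33 million, which is why a room-sized first-generation machine and a modern chip are barely comparable.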
Fourth Generation (1971-Present) Microprocessors
The microprocessor
brought the fourth generation of computers, as thousands of integrated
circuits were built onto a single silicon chip. What in the first
generation filled an entire room could now fit in the palm of the hand.
The Intel 4004 chip, developed in 1971, put an entire central processing unit on a single chip; together with its companion memory and input/output chips, it supplied all the components of a computer. As these small computers became more powerful, they could be linked
together to form networks, which eventually led to the development of
the Internet.
HOW IT WORKS
The microprocessor is the heart of any ordinary computer. A microprocessor -- also known as a CPU
or central processing unit -- is a complete computation engine that is
fabricated on a single chip. The first microprocessor was the Intel
4004, introduced in 1971. The 4004 was not very powerful -- all it could
do was add and subtract, and it could only do that 4 bits
at a time. But it was amazing that everything was on one chip. Prior to
the 4004, engineers built computers either from collections of chips or
from discrete components (transistors wired one at a time). The 4004 powered one of the first portable electronic calculators.
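To get a feel for "only 4 bits at a time," here is a small sketch of add and subtract in 4-bit arithmetic, where every result wraps around modulo 16. This illustrates what a 4-bit word size means; it is not an emulation of the actual 4004 instruction set.

    MASK = 0b1111  # four bits: values 0 through 15

    def add4(a, b):
        # Addition wraps: anything past 15 loses its carry-out bit.
        return (a + b) & MASK

    def sub4(a, b):
        # Subtraction wraps the same way, so negatives reappear at the top.
        return (a - b) & MASK

    print(add4(9, 9))   # 18 wraps around to 2
    print(sub4(3, 5))   # -2 wraps around to 14

Working with numbers larger than 15 meant stringing several 4-bit operations together, which is part of why the 4004 was so limited despite fitting on one chip.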
Fifth Generation (Present and Beyond) Artificial Intelligence
Fifth generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Quantum computation and molecular and nanotechnology
will radically change the face of computers in years to come. The goal
of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.
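As a toy illustration of what "capable of learning" means, here is a classic perceptron that learns the logical AND function from examples. This is one of the simplest forms of machine learning, shown only to make the idea concrete; it is not fifth-generation technology itself.

    # A toy perceptron: weights are nudged whenever a prediction is wrong,
    # until the training examples are classified correctly.
    def train_perceptron(samples, epochs=10, rate=0.1):
        w0, w1, bias = 0.0, 0.0, 0.0
        for _ in range(epochs):
            for (x0, x1), target in samples:
                prediction = 1 if w0 * x0 + w1 * x1 + bias > 0 else 0
                error = target - prediction
                w0 += rate * error * x0
                w1 += rate * error * x1
                bias += rate * error
        return w0, w1, bias

    # The truth table for AND serves as the training data.
    data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
    w0, w1, bias = train_perceptron(data)
    for (x0, x1), _ in data:
        print(x0, x1, "->", 1 if w0 * x0 + w1 * x1 + bias > 0 else 0)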
HOW IT WORKS
There are various forms of artificial intelligence (AI) out there today. It is a tough question what to call an artificial intelligence and what to merely call a software program. There is a tendency in software whereby, once something that used to be called "AI" matures and integrates itself into the technological backdrop, it doesn't get called AI anymore.
Computers have leapfrogged human society into another league. They are used in each and every aspect of human life, and they will spearhead the human quest to eradicate social problems like illiteracy and poverty. It is difficult to imagine a world bereft of computers. This revolutionary technology is indeed a boon to the human race. May computers continue to shower their blessings on us.