2. Computer Generations
In computer terminology, a generation is a change in the technology with which computers are built and used. Initially, the term was used to distinguish between varying hardware technologies, but nowadays a generation includes both hardware and software, which together make up an entire computer system.
3. Hardware
• The period of the third generation was 1964–1972.
• The third generation of computers is marked by the use of Integrated Circuits (ICs) in place of transistors. A single IC has many transistors, resistors and capacitors along with the associated circuitry. The IC was invented by Jack Kilby. This development made computers smaller in size, more reliable and more efficient.
WHEN?
4. Jack St. Clair Kilby (November 8, 1923 – June 20, 2005) invented the first Integrated Circuit.
WHO?
7. Hundreds or even thousands of transistors could be fabricated on a
single wafer of silicon. In addition, these fabricated transistors could be
connected to form logic circuits on the same chip.
Silicon wafers and silicon chips
9. ICs are more compact than transistors. A single IC has many transistors, resistors and capacitors placed on a single chip of silicon, so computers built from such components became smaller. Some of the computers developed during this period were:
•IBM 360 – developed by IBM in 1964
14. • TDC 316 – 1975
– a 16-bit computer. Data was recorded on punch cards, which were read by a card-reading machine that, in turn, sent signals to the mainframe computer.
15. Software
• In the beginning, magnetic core memories were used. Later they were replaced by semiconductor memories (RAM and ROM).
• In this generation the size of main memory reached about 100 MB.
• Microprogramming, parallel processing (pipelining, multiprocessor systems, etc.), multiprogramming and multi-user systems (time-shared systems) were introduced.
HOW?
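Time sharing, mentioned above, can be illustrated with a toy round-robin scheduler: each user program gets a fixed time slice in turn until it finishes. This is a simplified sketch, not a model of any real third-generation system; the job names and work units are invented.

```python
from collections import deque

# Toy round-robin time-sharing (illustrative only): each user program
# runs for a fixed time slice, then goes to the back of the queue.

def round_robin(jobs, time_slice):
    """jobs: dict of name -> remaining work units. Returns the run order."""
    queue = deque(jobs.items())
    order = []
    while queue:
        name, remaining = queue.popleft()
        order.append(name)                   # this user gets the CPU
        remaining -= time_slice
        if remaining > 0:
            queue.append((name, remaining))  # not done: back of the queue
    return order

order = round_robin({"alice": 3, "bob": 5, "carol": 2}, time_slice=2)
print(order)  # ['alice', 'bob', 'carol', 'alice', 'bob', 'bob']
```

Because every waiting program gets the processor at short, regular intervals, each user perceives the machine as serving them alone.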
16. • Operating system software was introduced (efficient sharing of a computer system by several user programs).
• Cache and virtual memories were introduced (cache memory makes the main memory appear faster than it really is; virtual memory makes it appear larger).
• High-level languages were standardized by ANSI, e.g. ANSI FORTRAN and ANSI COBOL.
• Database management, multi-user applications and online systems such as closed-loop process control, airline reservation, interactive query systems and automatic industrial control emerged during this period.
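The idea that a cache makes main memory appear faster can be sketched as a toy model. The memory contents, cache size and address layout below are invented for illustration; real caches deal in lines of bytes and hardware timing.

```python
# Toy direct-mapped cache in front of a slow "main memory" (illustrative).
# A hit is served from the fast cache; a miss fetches from main memory
# and fills the cache line, so repeated accesses appear much faster.

MAIN_MEMORY = {addr: addr * 2 for addr in range(64)}  # pretend contents
CACHE_LINES = 8                                       # 8 direct-mapped lines

cache = {}           # line index -> (tag, value)
hits = misses = 0

def read(addr):
    """Read one word, going to main memory only on a cache miss."""
    global hits, misses
    line, tag = addr % CACHE_LINES, addr // CACHE_LINES
    if line in cache and cache[line][0] == tag:
        hits += 1
        return cache[line][1]      # fast path: served from the cache
    misses += 1
    value = MAIN_MEMORY[addr]      # slow path: fetch from main memory
    cache[line] = (tag, value)     # fill the line for next time
    return value

# A loop that re-reads the same few addresses mostly hits the cache.
for _ in range(10):
    for addr in (0, 1, 2, 3):
        read(addr)

print(hits, misses)  # 36 4 — only the first pass misses
```

Because programs tend to re-use recently accessed addresses (locality), most reads are served at cache speed, which is why the main memory "appears faster than it really is".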
17. Microprogramming
Microprogramming is a method of accomplishing the control unit function by describing the steps in that function as a sequence of register-transfer-level operations that are much more elementary than machine instructions. In this method of designing and building a control unit, an additional memory, commonly called a microprogram store, contains a sequence of microinstructions. A number of microinstructions are required to carry out an ordinary machine instruction, so the microprogram store should be faster (have a shorter cycle time) than the normal fast memory.
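As a rough illustration of the definition above, a microprogrammed control unit can be modelled as a table that maps each machine instruction to its sequence of elementary micro-operations. All opcodes and micro-op names here are invented for the sketch, not taken from any real machine.

```python
# Sketch of a microprogrammed control unit (opcodes and micro-ops are
# invented). Each machine instruction is carried out by executing several
# elementary register-transfer steps read from a "microprogram store".

MICROPROGRAM_STORE = {
    # machine instruction -> sequence of micro-operations
    "ADD":  ["fetch_operand_A", "fetch_operand_B", "alu_add", "write_result"],
    "LOAD": ["compute_address", "read_memory", "write_register"],
}

def execute(instruction, trace):
    """Run one machine instruction as its sequence of microinstructions."""
    for micro_op in MICROPROGRAM_STORE[instruction]:
        trace.append(micro_op)   # stand-in for driving the control signals

trace = []
execute("ADD", trace)
execute("LOAD", trace)
print(len(trace))  # 7 micro-operations for 2 machine instructions
```

Since several microinstructions are fetched per machine instruction, the microprogram store must cycle faster than main memory, exactly as the slide states.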
18. COBOL PL
COBOL (COmmon Business Oriented Language) was one of the earliest high-level programming languages. It was developed in 1959 by a group of computer professionals called the Conference on Data Systems Languages (CODASYL).
• The language that automated business.
• Allows names to be truly connotative: permits both long names (up to 30 characters) and word-connector characters (dashes).
• Offers object and visual programming environments, and class libraries.
• Integration with the World Wide Web.
20. FORTRAN PL
One of the oldest programming languages, FORTRAN was developed by a team of programmers at IBM led by John Backus and was first published in 1957. The name FORTRAN is an acronym for FORmula TRANslation, because it was designed to allow easy translation of math formulas into code.
• Simple to learn.
• Machine independent: allows for easy transportation of a program from one machine to another.
• More natural ways to express mathematical functions: FORTRAN permits even severely complex mathematical functions to be expressed similarly to regular algebraic notation.
22. Advantages
• Smaller in size as compared to previous generations.
• More reliable.
• Used less energy.
• Produced less heat as compared to the previous two generations of computers.
• Better speed; could calculate data in nanoseconds.
• Used fans for heat discharge to prevent damage.
• Maintenance cost was low because hardware failures were rare.
• Totally general-purpose.
• Could be used for high-level languages.
• Good storage.
• Versatile to an extent.
• Less expensive.
• Better accuracy.
• Commercial production increased.
• Used mouse and keyboard for input.
Disadvantages
• Air conditioning was required.
• Highly sophisticated technology was required for the manufacture of IC chips.