Definition:
A computer is an electronic device that accepts data as input, processes it, and generates results as output, or stores them for later use.
INTRODUCTION
The original definition of the word "computer" was a
person who made calculations. This usage goes back to the 1600s and
extended into the mid-20th century, when the term "computer"
began to refer to a machine. The computer is based on the same concept as the
abacus, which goes back many centuries.
Abacus
The abacus dates back to ancient times and
was invented by the Chinese. Ten beads were strung onto wires attached to a
frame, and addition and subtraction were read from the final positions of the
beads. It is considered the first manual tool used to calculate answers to
problems, providing information and, in a primitive way, storing the results.
Mechanical Clock
During the Middle Ages the mechanical clock became the first
closed system for calculating information. The parts of the clock calculated
the time of day, which was displayed by the position of two hands on its face.
The inventor pre-programmed the clock's instructions through the way the pull
of the weights and the swing of the pendulum, acting through the gears,
established the position of the hands on the clock face.
Mathematics
John Napier (a Scotsman, early 1600s) discovered
logarithms. He devised a system in which he placed the logarithms on a set of
ivory rods called "Napier's Bones". By sliding the numbers up and
down he created a very primitive slide rule. Robert Bissaker refined the
system by placing the numbers on sliding pieces of wood rather than ivory.
Blaise Pascal
(1642) Pascal developed the first real
calculator. Addition and subtraction were carried out by a series of
very light rotating wheels. The same principle is still used today in car
odometers, which track a car's mileage.
Gottfried von Leibniz
(German mathematician) In 1690
Leibniz developed a machine that could add, subtract, multiply, divide, and
calculate square roots. Instructions were programmed into the machine through
the use of gears. The drawback to this machine was that the instructions could
not be changed without changing the whole machine.
Joseph Jacquard
(early 1800s) Jacquard developed a
loom controlled by punched cards. The cards, made of cardboard, were
programmed with instructions. Each card represented a loop, and the
machine read the cards as they were passed over a series of rods. The loom's
punched cards were the early ancestor of the IBM punched card.
Charles Babbage
(1812) Babbage was a genius
who saw few of his inventions actually built. He designed and built a model of
what was called the difference engine. This invention was designed to
perform calculations without human intervention; the ultimate goal was to have
the machine calculate logarithm tables and print the results. Babbage was so
far ahead of his time that the technology to manufacture the parts for his
machine did not exist, so he was only able to build a small model. In 1833,
Babbage designed the analytic engine. This machine had many of the same parts
found in modern-day computers. It had an arithmetic unit which performed
calculations, and another part called the "store" which held intermediate and
final results and instructions at each stage of calculation. It was to get its
instructions from punched cards and worked through mechanical means, and it
would have been able to perform any calculation. Babbage died before the
machine could be built. His son built a small model of the work that still
exists today. Babbage became known as the father of the modern computer.
Dr. Herman Hollerith
(late 1800s, statistician) Hollerith
used the punched-card method to process data gathered in the census. The
previous census had taken seven years to complete because of the large amount
of data that needed to be processed. By developing the Hollerith
code and a series of machines which could store census data on cards, he
was able to accomplish the accounting of the census in two and a half years,
even with an additional two million pieces of data added. His code was able to
sort the data according to the needs of the United States government. He was
known for developing the first computer card and for accomplishing the largest
data-processing endeavor undertaken up to that time. Hollerith set up the
Tabulating Machine Company, which manufactured and marketed punched cards and
equipment to the railroads, who used the equipment to tabulate freight
schedules. In 1911, the Tabulating Machine Company merged with other
companies to form what became the International Business Machines Corporation (IBM).
William Burroughs
(late 1890s) Burroughs designed the mechanical
adding machine. The machine was key-driven and operated by way of a crank.
The Burroughs Adding Machine Company would become one of the giants of the
computer industry. His machine could record, calculate, and summarize.
Burroughs later merged with Sperry to form Unisys, which builds computers.
The Years from 1900-1940
During the next forty years, more
adding, calculating, and tabulating machines were developed. Eventually the
machines evolved to the point where they could multiply, interpret alphabetic
data, keep records, and perform other accounting functions. They were called
accounting machines.
Howard Aiken
(1944) The Mark I was developed, through a
collaboration between Harvard University, IBM, and the U.S. War Department,
to handle a large amount of number crunching. The complex equation solving
needed to map military logistics was the driving force behind the project.
(The United States was at war with Germany.) The Mark I was the first
automatic calculator. It was not electronic, but it did use electromagnetic
relays with mechanical counters. It was said that when it ran, the clicking
sound was unbearable. Paper tape with holes punched in it provided the
instruction sets, and the output was returned through holes punched in cards.
J. Presper Eckert and John W.
Mauchly
(ENIAC, 1946, University of
Pennsylvania) The ENIAC (Electronic Numerical Integrator and Calculator) was an
electronic computer sponsored by the War Department. It was classified because
of its wartime purpose. The ENIAC was so large that it took up a room ten feet
high by about ten feet wide and several hundred feet in length. It could
perform a multiplication in about three thousandths of a second. There were
18,000 vacuum tubes in the machine, and instructions had to be fed in by way
of switches because there was no internal memory within the machine.
John von Neumann
(late 1940s) Von Neumann devised a way to encode
instructions and data in the same language. This paved the way for computer
instructions to be stored in the computer itself. He was the force behind the
development of the first stored-program computer.
A Race between the EDVAC and the
EDSAC
Two groups were working at the same time to develop the first stored-program
computer. In the United States, the EDVAC (Electronic Discrete Variable
Automatic Computer) was being built at the University of Pennsylvania. In
England, at Cambridge, the EDSAC (Electronic Delay Storage Automatic Computer)
was also being developed. The EDSAC won the race, beating the United States'
EDVAC by two months to become the first stored-program computer. The EDSAC
performed computations in the three-millisecond range and carried out
arithmetic and logical operations without human intervention. The key to its
success was the stored instructions on which it depended solely for its
operation. This machine marked the beginning of the computer age.
FIRST GENERATION (1951-1958)
John W. Mauchly and J. Presper
Eckert
(1951) The first generation of
computers started with the UNIVAC I (Universal Automatic Computer), built by
Mauchly and Eckert and sold to the U.S. Census Bureau. This machine was
dedicated to business data processing rather than military or scientific purposes.
CHARACTERISTICS OF FIRST GENERATION
COMPUTERS
Use of vacuum tubes in
electronic circuits: These tubes controlled internal operations and were huge.
As a consequence the machines were large.
Magnetic drum
as primary internal-storage medium:
Data was recorded magnetically on the surface of a rotating drum, with
magnetized spots representing on and off states.
Limited main-storage capacity:
Slow input/output, punched-card-oriented: Operators performed input and
output operations through the use of punched cards.
Low-level symbolic-language programming: The computer used machine language, which was cumbersome
and accomplished through long strings of numbers made up of zeroes and ones. In
1952, Dr. Grace Hopper (then at Remington Rand) developed a symbolic
language called mnemonics (instructions written with symbolic codes). Rather
than writing instructions with zeroes and ones, programmers wrote mnemonics,
which were then translated into binary code. Dr. Hopper developed the first
set of programs, or instructions, to tell computers how to perform this translation.
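
To make the contrast concrete, the following Python sketch shows how a symbolic-language translator might map mnemonics to binary machine code. The mnemonics, opcodes, and instruction format here are purely hypothetical, invented for illustration; they do not correspond to any real first-generation machine or to Hopper's actual software.

# A minimal sketch of translating symbolic mnemonics into binary machine code.
# The instruction set below (LOAD/ADD/STORE/HALT and their opcodes) is
# hypothetical, invented only to illustrate the idea behind symbolic programming.

OPCODES = {"LOAD": 0b0001, "ADD": 0b0010, "STORE": 0b0011, "HALT": 0b0000}

def assemble(program):
    """Translate (mnemonic, operand) pairs into 8-bit words:
    a 4-bit opcode followed by a 4-bit operand address."""
    words = []
    for mnemonic, operand in program:
        word = (OPCODES[mnemonic] << 4) | (operand & 0b1111)
        words.append(f"{word:08b}")
    return words

# Instead of writing "00010101 00100110 ..." by hand, the programmer writes:
source = [("LOAD", 5), ("ADD", 6), ("STORE", 7), ("HALT", 0)]
print(assemble(source))  # ['00010101', '00100110', '00110111', '00000000']
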
Heat and maintenance problems: The tubes gave off tremendous amounts of heat, so special
air-conditioning and maintenance were required for the machines.
Applications: payroll processing and record keeping, though the machines were still
oriented more toward scientific applications than business data processing.
Examples: IBM 650, UNIVAC I
SECOND GENERATION COMPUTERS (1959-1964)
CHARACTERISTICS OF SECOND GENERATION
COMPUTERS
Use of transistors for internal operations: Tiny solid-state transistors
replaced vacuum tubes in computers. The heat problem was thereby minimized, and
computers could be made smaller and faster.
Magnetic core as primary internal-storage medium: Electric currents passed
through wires magnetized the cores to represent on and off states. Data in
the cores could be found and retrieved for processing in a few millionths of a
second.
Increased main-storage capacity: The internal or main storage was supplemented by
magnetic tapes for external storage, which substituted for punched cards
or paper. Magnetic disks were also developed that stored information on
circular tracks resembling phonograph records. The disks provided direct,
or random, access to records in a file.
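
The practical difference between the sequential access of tape and the direct access of disk can be sketched in a few lines of Python. The fixed-length record layout below is a toy assumption made for illustration, not any historical file format.

# A toy illustration of sequential versus direct (random) access.
# The "tape" must be scanned from the start, while the "disk" can seek
# straight to the record's byte offset.

RECORD_SIZE = 32  # bytes per fixed-length record (illustrative only)

def read_sequential(tape, target_index):
    """Tape-style access: step through every record until the target."""
    for index in range(target_index + 1):
        record = tape[index * RECORD_SIZE:(index + 1) * RECORD_SIZE]
    return record

def read_direct(disk, target_index):
    """Disk-style access: jump directly to the record's offset."""
    offset = target_index * RECORD_SIZE
    return disk[offset:offset + RECORD_SIZE]

storage = bytes(range(256)) * 16  # 4096 bytes = 128 records
assert read_sequential(storage, 100) == read_direct(storage, 100)
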
Faster input/output; tape orientation: Devices could be connected directly to
the computer and were considered "on-line". This allowed for faster
printing and for the detection and correction of errors.
High-level programming languages (COBOL, FORTRAN): These languages resembled English.
FORTRAN (FORmula TRANslator) was the first high-level language to be
widely accepted; it was used mostly for scientific applications.
COBOL (COmmon Business-Oriented Language) was developed around 1960 for business
data processing. Its main features include file-processing, editing, and
input/output capabilities.
Increased speed and reliability: Modular hardware was developed through the design of
electronic circuits. Complete modules, called "breadboards", could be
replaced if malfunctions occurred or the machine "crashed". This
decreased lost time, and new modules could also be added for extra features such
as file-processing, editing, and input/output.
Batch-oriented applications (billing, payroll processing, updating inventory files):
Batch processing allowed data to be collected over a period of time and then
processed in one computer run, with the results stored on magnetic tapes.
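
A batch run of this kind can be sketched in Python: transactions are collected over time and then processed together in a single pass. The record fields and the flat hourly pay rate below are invented for illustration only; they are not drawn from any real payroll system.

# A minimal sketch of batch processing: transactions accumulate over a
# period of time and are then processed together in one run.

collected_batch = []  # records gathered throughout the period

def collect(employee, hours):
    """Collection phase: just store the raw transaction."""
    collected_batch.append({"employee": employee, "hours": hours})

def run_batch(batch, hourly_rate=10.0):
    """Processing phase: one computer run over the whole batch."""
    return [{"employee": r["employee"], "pay": r["hours"] * hourly_rate}
            for r in batch]

collect("A. Smith", 40)
collect("B. Jones", 35)
print(run_batch(collected_batch))
# [{'employee': 'A. Smith', 'pay': 400.0}, {'employee': 'B. Jones', 'pay': 350.0}]
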
Examples: IBM 1401 (the most popular business-oriented computer), Honeywell 200, CDC 1604
THIRD GENERATION COMPUTERS (1965-1970)
CHARACTERISTICS OF THIRD GENERATION COMPUTERS:
Use of integrated circuits: Integrated circuits (ICs) replaced the transistors
of the second-generation machines. The circuits were etched and printed, and
hundreds of electronic components could be put on silicon chips less
than one-eighth of an inch square.
Magnetic core and solid-state main
storage: Greater storage capacity was
developed.
More flexibility with input/output; disk-oriented:
Smaller size and better performance
and reliability: Advances in solid-state technology
allowed for the design and building of smaller and faster computers.
Breadboards could easily be replaced on the fly.
Extensive use of high-level
programming languages:
The software industry evolved during this time. Many
users found it more cost-effective to buy pre-programmed packages than
to write the programs themselves. Programs from the second generation had
to be rewritten, since many of them were tied to second-generation
architecture.
Emergence of minicomputers: Minicomputers offered many of the same features as
mainframe computers, only on a smaller scale. These machines filled the needs of
the small business owner.
Remote processing and time-sharing through communication: Computers
were now able to perform several operations at the same time. Remote terminals
were developed to communicate with a central computer from distant
geographic locations, and time-sharing environments were established.
Availability of operating systems (software)
to control I/O and do tasks handled by human operators: Software was developed
to take care of routine tasks required of the computer, freeing up the human
operator.
Applications such as airline reservation systems, market forecasting, and
credit card billing: The applications also included inventory control and the
scheduling of labor and materials. Multitasking was also accomplished, and both
scientific and business applications could be run on the same machine.
Examples: IBM System/360, NCR 395, Burroughs B6500
FOURTH GENERATION (1970- )
CHARACTERISTICS OF FOURTH GENERATION
COMPUTERS:
Use of large scale integrated
circuits
Increased storage capacity and speed.
Modular design and compatibility between equipment
Special application programs
Versatility of input/output devices
Increased use of minicomputers
Introduction of microprocessors
and microcomputers
Applications: mathematical modeling and simulation, electronic funds
transfer, computer-aided instruction and home computers. Internet explosion.
KEY
POINTS
- Control Data Corporation: the STAR-100
  computer, which has a vector-based design. Information is processed as
  vectors instead of individual numbers, allowing faster speeds when problems
  are processed in vector form. Charles A. Burrus develops the
  light-emitting diode (LED). RCA develops MOS (metal-oxide
  semiconductor) technology for making integrated circuits, making them
  cheaper and faster to produce; the circuits can also be made smaller.
- Texas Instruments introduces
  the first pocket calculator, the Pocketronic. It can add, subtract, multiply,
  and divide, and it costs around $150.
- The Odyssey, the first home video game console, is developed by
  Magnavox. Intel develops the first 8-bit microprocessor
  chip, the 8008 (used in the Mark-8 personal minicomputer). Nolan Bushnell
  invents the video game Pong and founds Atari.
- Using LSI (large-scale
  integration), ten thousand components are placed on a chip of one square
  inch.
- Hewlett-Packard introduces the
  programmable pocket calculator. David Ahl develops a microcomputer
  consisting of a video display, keyboard, and central processing unit. DRAM
  (dynamic random-access memory) becomes commercially available and will be
  used in the first personal computers.
- Edward Roberts introduces the
  first personal computer, called the Altair 8800, in kit form. It has 256
  bytes of memory.
- A computer chip with 16
kilobits (16,384 bits) of memory becomes commercially available. It will
be used in the first IBM personal computer.
- Steve P. Jobs and Stephen
  Wozniak introduce the Apple II, the first personal computer sold in assembled
  form. Xerox introduces the Star 8010, an office computer based on the Alto
  developed a few years earlier. The first linked automatic teller machines
  (ATMs) are introduced in Denver.
- DEC introduces a 32-bit
  computer with a virtual address extension (VAX). It runs large programs
  and becomes an industry standard for scientific and technical systems; its
  operating system is called VMS. Intel introduces its first 16-bit
  processor, the 8086. The related 8088 is used as the central processing unit
  in the first IBM PC.
- Control Data introduces the Cyber
  203 supercomputer. Motorola introduces the 68000 microprocessor chip. It
  has a 24-bit address capacity and can address 16 megabytes of memory, and it
  will be the basis for the Macintosh computer developed by Apple. Steven
  Hofstein invents the field-effect transistor using metal-oxide technology
  (MOSFET). (1981) The IBM Personal Computer uses the industry-standard disk
  operating system (DOS).
- IBM introduces the 5120
microcomputer. It is not successful.
- Osborne builds the first
  portable computer, in which the disk drives, monitor, and processor are
  mounted in a single box the size of a suitcase. Clive Sinclair develops the
  ZX81, which connects to a television receiver. The Japanese produce 64-kilobit
  memory chips (65,536 bits) and capture the world market.
- Columbia Data Products
  announces the first computer based on the IBM PC that runs programs
  designed for the IBM machine; such machines get the name "clones". Compaq
  introduces its first IBM PC clone, which is portable. Japan starts a
  nationally funded project to develop a fifth-generation computer based on
  artificial intelligence using the Prolog language.
- IBM's PC-XT is introduced. It is
  the first personal computer with a hard drive built into the computer. It
  can store 10 megabytes of information even when the machine is turned off,
  replacing many floppy diskettes. The machine runs the updated DOS 2.0.
  IBM introduces the PCjr, a scaled-down version of the IBM PC; it is
  unsuccessful. Inmos (a British company) develops the transputer, in which
  several processors are contained in one computer and work simultaneously on
  the same problem. Intel introduces the 8080, an 8-bit microprocessor that
  replaces the 8008.
- Philips and Sony introduce the
  CD-ROM (compact disc read-only memory), an optical disk that can store
  large amounts of information. Apple introduces the Macintosh, a graphics-based
  computer that uses icons, a mouse, and an intuitive interface derived
  from the Lisa computer. IBM's PC AT (Advanced Technology) computer,
  designed around the 16-bit Intel 80286 processor chip and running at 6 MHz,
  becomes the first personal computer to use a new chip to expand speed and
  memory. Motorola introduces the 68020 version of the 68000 series of
  microprocessors; it has 32-bit processing and addressing capacity. NEC
  manufactures computer chips in Japan with 256 kilobits (262,144 bits) of
  memory. IBM introduces a megabit RAM (random-access memory) chip
  with four times the memory of earlier chips.
- Microsoft develops Windows for
  the IBM PC. Intel introduces the 80386, a 32-bit microprocessor. At Bell
  Labs, Masaki Togai and Hiroyuki Watanabe develop a logic chip that operates
  on fuzzy logic.
- Compaq leaps past IBM by
introducing the DeskPro, a computer that uses an advanced 32-bit
microprocessor, the Intel 80386. It can run software faster than the
quickest 16-bit computer. Terry Sejnowski at Johns Hopkins in Baltimore
develops a neural network computer that can read text out loud without
knowing any pronunciation rules. The first DAT (digital audio tape) recorders
are developed in Japan.
- The Macintosh II and Macintosh
  SE, made by Apple, become the most powerful personal computers. Sega
  introduces a video game whose images appear three-dimensional. Telephones
  become available on commercial airplanes. Computer chips are manufactured
  with 1 megabit (1,024 kilobits, or 1,048,576 bits) of memory. Japan also
  introduces experimental 4- and 16-megabit chips.
- Motorola introduces its 32-bit
  88000 series of RISC (reduced instruction set computing) microprocessors,
  which can operate much faster than conventional chips. Compaq and Tandy
  develop the EISA (Extended Industry Standard Architecture). Steven Jobs
  introduces the NeXT Computer System, a graphics-based system that
  includes a 256-megabyte optical storage disk and 8 megabytes of RAM. Robert
  Morris releases a computer worm onto the Internet that causes much of the
  network to go down for two days. Scriptel introduces a method for inputting
  data into a computer by writing on a screen.
- Japan initiates daily
  broadcasts of its analog version of high-definition television (HDTV).
  Philips and Sony bring the videodisk to the open market. Seymour Cray
  founds the Cray Computer Corporation.
- Bell Laboratories' Alan Huang
  demonstrates the first all-optical processor. Hewlett-Packard announces a
  computer with a RISC processor; IBM later introduces the RS/6000 family of
  RISC workstations. Computer chips are introduced with 4 megabits of
  memory. Intel introduces the i486 processor chip, which can operate at 33
  MHz, and also launches the iPSC/860, designed for multiprocessor computing.
  Motorola introduces the 68040 version of its 68000 series of
  microprocessors; the chip has 1.2 million transistors. IBM develops a
  transistor that can operate at 75 billion cycles per second.
- The 64-megabit dynamic random-access memory (DRAM) chip is invented.
- IBM develops the silicon-on-insulator (SOI) bipolar transistor. It can
  operate at 20 GHz.
- Harry Jordan and Vincent
  Heuring develop the first general-purpose all-optical computer capable of
  being programmed and of manipulating instructions internally. Intel ships
  its Pentium processor to computer manufacturers. It is the fifth
  generation of the chip that powers the PC; it contains 3.1 million
  transistors and is twice as fast as the fourth-generation 486DX2. Fujitsu
  in Japan announces a 256-megabit memory chip.
OBSERVATION
The world is changing rapidly, and with it comes an explosion of information.
The computer is an ever-changing and evolving beast. Currently, quad-core CPUs
are the fastest on the market, advertised at speeds of 2.4 or 2.66 GHz.
Components are becoming smaller and computers are becoming faster. Multimedia
and web-based publishing are the current trends.
There is a rush to incorporate networks and Internet access into the schools,
and with the development of Internet2, virtual reality seems to be coming
closer to reality.
The
world is now global, thanks to computers.