Computers
A common misconception about computers is that they are smarter
than humans. Actually, the degree of a computer's intelligence
depends on the speed of its ignorance. Today's complex computers
are not really intelligent at all. The intelligence is in the
people who design them. Therefore, in order to understand the
intelligence of computers, one must first look at the history of
computers, the way computers handle information, and, finally, the
methods of programming the machines.
The predecessors of today's computers were nothing like the machines
we use now. The first known computer was Charles Babbage's
Analytical Engine, designed in 1834. (Constable 9) It was a
remarkable device for its time. In fact, the Analytical Engine
required so much power and was so far beyond the manufacturing
methods of its time that it could never be built.
No more than twenty years after Babbage's death, Herman Hollerith
designed an electromechanical machine that used punched cards to
tabulate the 1890 U.S. Census. His tabulating machine was so
successful that he formed the company that would later become IBM to
supply it. (Constable 11) The computers of those times worked with
gears and mechanical computation.
Unlike today's chip computers, the first computers were
non-programmable, electromechanical machines. No one would ever
confuse the limited power of those early machines with the wonder
of the human brain. One of the most famous early machines was the
ENIAC, or Electronic Numerical Integrator and Computer. It was a
huge, room-sized machine, designed to calculate artillery firing
tables for the military. (Constable 9) ENIAC was built with more than
19,000 vacuum tubes, nine times the amount ever used prior to this. The
internal memory of ENIAC was a paltry twenty decimal numbers of ten
digits each. (Constable 12) (Today's average home computer can hold
roughly 20,480 times this amount.)
Today, the chip-based computer easily packs the power of more than
10,000 ENIACs into a silicon chip the size of an infant's
fingertip. (Reid 64) The chip itself was invented by Jack Kilby and
Robert Noyce in 1958, but their crude devices looked nothing like
the sleek, paper-thin devices common now. (Reid 66) The first
integrated circuit had but four transistors and was half an inch
long and narrower than a toothpick. Chips found in today's PCs,
such as the Motorola 68040, cram more than 1.2 million transistors
onto a chip half an inch square. (Poole 136)
The ENIAC was an extremely expensive, huge and complex machine,
while PCs now are shoebox-sized gadgets costing but a few thousand
dollars. Because of the incredible miniaturization that has taken
place, and because of the seemingly “magical” speed at which a
computer accomplishes its tasks, many people look at the computer
as a replacement for the human brain. Once again, though, the
computer can only accomplish its amazing feats by breaking down
every task into its simplest possible choices.
Of course, the computer must receive, process and store data in
order to be a useful tool. Data can be text, programs, sounds,
video, graphics, etc. Some devices for entering data are keyboards,
mice, scanners, pressure-sensitive tablets, or any instrument that
tells the computer something. The keyboard is the most popular
input device for entering text, commands, programs, and the like.
(Tessler 157) Newer computers that use a GUI (pronounced “gooey”),
or Graphical User Interface, rely on a mouse as the main device for
entering commands. A mouse is a small tool with at least one button
on it and a small tracking ball on the bottom. When the mouse is
slid across a surface, the ball tracks the movement and the computer
moves the pointer on the screen accordingly. (Tessler 155) A
pressure-sensitive tablet is mainly used by graphic artists to
easily draw with the computer. The artist uses a special pen to
draw on the large tablet, and the tablet sends the data to the
computer.
Once the data is entered into the computer, it does no good until
the computer can process it. This is accomplished by the millions
of transistors compressed into the thumbnail-sized chip in the
computer. These transistors are not at all randomly placed; they
form a sequence, and together they make a circuit. A transistor
alone can only turn on and off. In the “on” state, it will permit
electricity to flow; in the “off” state, it will keep electricity
from flowing. (Poole 136) However, when all the microscopic
transistors are interconnected, they have the ability to control,
manipulate, and move data according to the condition of other data.
A computer's chip is so ignorant that it must use a series of sixteen
transistors and two resistors just to add two and two. (Poole 141)
Nevertheless, this calculation can be made in just a microsecond,
an example of the incredible speed of the PC. The type of chip
mainly used in PCs now is known as a CISC, or Complex Instruction Set
Computer. (Constable 98)
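To make that concrete, the following is a minimal sketch in Python, not the
sixteen-transistor circuit the essay cites, of how an adder reduces "two and
two" to nothing but on/off logic decisions.

    # A minimal sketch of binary addition built from on/off logic operations.
    # This models the logic, not the transistor-and-resistor circuit itself.

    def half_adder(a, b):
        # Add two single bits; XOR gives the sum bit, AND gives the carry bit.
        return a ^ b, a & b

    def full_adder(a, b, carry_in):
        # Add two bits plus the carry arriving from the previous column.
        s1, c1 = half_adder(a, b)
        s2, c2 = half_adder(s1, carry_in)
        return s2, c1 | c2

    def add_four_bit(x, y):
        # Add two small numbers one bit column at a time, as a chip's adder does.
        carry, result = 0, 0
        for i in range(4):
            s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
            result |= s << i
        return result

    print(add_four_bit(2, 2))   # prints 4, computed entirely from on/off choices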
Newer workstation-class computers use the RISC type of chip,
which stands for Reduced Instruction Set Computer. While the “complex”
type might sound better, the architecture of the RISC chip permits
it to work faster. The first generation of integrated chips was called
SSI, or Small Scale Integration. SSI chips have fewer than one
hundred components. (Reid 124) The period of the late 1960s is
known as the era of MSI, or Medium Scale Integration. MSI chips
range from one hundred to one thousand components each. (Reid 124)
LSI, or Large Scale Integration, was used primarily in the 1970s,
each chip containing up to ten thousand components.
Chips used in
the 1990s are known as VLSI, or Very Large Scale Integration, with
up to a million or more components per chip. In the not-so-distant
future, ULSI, or Ultra Large Scale Integration, will be the final
limit of the miniaturization of the chip.
The transistors will then be on the atomic level and the
interconnections will be one atom apart. (Reid 124) Because further
miniaturization will not be practical, “parallel” systems that split
jobs among hundreds of processors will become common in the future.
Once data is entered and processed, it will be lost forever if it
is not stored. Computers can store information in a variety of
ways. The computer's permanent instructions, which it uses for
basic tasks such as system checks, are stored in ROM, or Read Only
Memory. Programs, files, and system software are stored on either a
hard disk or floppy disk in most systems.
The hard disk and floppy disk function similarly, but hard disks
can hold much more information. They work by magnetizing and
demagnetizing small areas on a plastic or metal platter. The “read”
head then moves along the tracks to read the binary information.
When the program or file being read is opened, it is loaded into
RAM (Random Access Memory) where it can be quickly accessed by the
processor. RAM is in small chips called SIMMs, or Single Inline
Memory Modules. The speed of RAM is much faster than a disk drive
because there are no moving parts. The information is represented
by either a one or a zero, and this amount of information is called
a bit. (Constable 122) Four bits make a nybble, and two nybbles
make a byte. One byte can hold one character, such as “A” or “?”.
1,024 bytes make a kilobyte, 1,024 kilobytes make a megabyte, 1,024
megabytes make a gigabyte, and 1,024 gigabytes make a terabyte. Most
personal computers have approximately eighty or so megabytes of
hard drive space and either two or four megabytes of RAM on
average. Most ROM on PCs is about 256 kilobytes.
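The relationships between those units can be written out directly; the short
Python sketch below assumes the binary convention (1,024, not 1,000) for every
step, and uses the figures quoted above for a typical early-1990s PC.

    # Storage units as described above, using the binary 1,024 convention.
    BITS_PER_NYBBLE = 4
    NYBBLES_PER_BYTE = 2
    BITS_PER_BYTE = BITS_PER_NYBBLE * NYBBLES_PER_BYTE   # 8 bits in a byte

    KILOBYTE = 1024                 # bytes
    MEGABYTE = 1024 * KILOBYTE
    GIGABYTE = 1024 * MEGABYTE
    TERABYTE = 1024 * GIGABYTE

    # One byte holds one character, such as "A": here are its eight bits.
    print(format(ord("A"), "08b"))          # 01000001

    # The essay's figures for a typical PC of the time, expressed in bytes:
    print(80 * MEGABYTE)                    # hard drive space
    print(4 * MEGABYTE)                     # RAM
    print(256 * KILOBYTE)                   # ROM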
Machine language is the way all computers handle instructions: the
simple, one-or-zero, yes-or-no, true-or-false Boolean logic
necessary for computers. (Reid 122) Boolean logic was invented by
George Boole, a poor British mathematician born in 1815. His new type
of logic was mostly ignored until the makers of computers more than a
century later realized that his was the ideal system of logic for the
computer's binary system. Machine code is the only programming
“language” the computer understands. Unfortunately, the endless and
seemingly random strings of ones and zeros are almost
incomprehensible to humans.
Not long after computers such as ENIAC came along, programmers
began to develop simple mnemonic “words” to stand in the place of
the crude machine code. The words still had to be changed into
machine code to be run, though. This simple advancement greatly
helped the programmers with their tasks. Even with these
improvements, the process of programming was still a mind-boggling
task.
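As a rough illustration of that advancement, the Python sketch below translates
mnemonic "words" into binary instruction words; the three-instruction set is
made up for the example and does not belong to any real machine.

    # A toy "assembler": mnemonics are only a convenience for people, and must
    # still be turned into ones and zeros before a machine could run them.
    # The opcodes below are hypothetical, not those of any real chip.

    OPCODES = {"LOAD": "0001", "ADD": "0010", "STORE": "0011"}

    def assemble(line):
        # Turn one mnemonic line such as "ADD 2" into an 8-bit instruction word.
        mnemonic, operand = line.split()
        return OPCODES[mnemonic] + format(int(operand), "04b")

    for line in ["LOAD 2", "ADD 2", "STORE 7"]:
        print(line, "->", assemble(line))   # e.g. "ADD 2 -> 00100010"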
The so-called high-level languages are the type used for
programming in the 90s. Rarely is there ever a need today for
programming in machine code. A high-level language works by converting
its English-based commands into machine code by way of a translator
program. (Constable 122) There are two types of translator programs:
compilers and interpreters. A compiler converts the entire program
into machine code before it runs; an interpreter converts only one
line at a time, while the program runs.
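The contrast can be sketched with Python's own tools standing in for the real
thing; this is only an analogy, since the compilers and interpreters the essay
describes produce machine code rather than Python bytecode.

    # Interpreter style vs. compiler style, sketched with a three-line program.
    program = [
        "x = 2 + 2",
        "y = x * 10",
        "print(y)",
    ]

    # Interpreter style: each line is translated and executed immediately,
    # one at a time, before the next line is even looked at.
    namespace = {}
    for line in program:
        exec(line, namespace)

    # Compiler style: the whole program is translated first, then run as a unit.
    translated = compile("\n".join(program), "<toy program>", "exec")
    exec(translated, {})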
The first compiler language was Fortran. Fortran became quite
popular after its release in 1957 and is still used for some
purposes to this day. Cobol is another high-level compiler language
that has been used widely in the business world from 1960 until
now. A compiler must be utilized before a program can be run. The
compiler translates the program into the ones and zeros of binary
machine code. There are many compiler languages used today, such as
C and Pascal, the latter named for the French genius Blaise Pascal. These two
languages are the most popular high-level languages used for
application development.
The interpreter languages are better suited for home computers than
business needs; they are less powerful, but much simpler to use. An
interpreter language is translated into machine code and sent to
the processor one line of code at a time. The first popular
interpreter language was BASIC, or Beginner's All-purpose Symbolic
Instruction Code, written by John Kemeny and Thomas Kurtz at
Dartmouth College. BASIC is still a much-used language, and is
included free with many PCs sold today. BASIC was the first
programming language to use the INPUT command, which allows the
user to input information into the program as it is running.
(Constable 29) Another newer and less popular interpreter language
is HyperTalk, a language that is very English-like and easy to
understand. It is included free with every Macintosh computer.
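The idea behind INPUT can be shown with Python standing in for BASIC (the
examples here use Python throughout): the running program stops and waits for
the user to type something.

    # A running program asking the user for data, in the spirit of BASIC's INPUT.
    name = input("What is your name? ")       # the program pauses here
    age = int(input("How old are you? "))     # typed text becomes a number
    print("Hello,", name, "- next year you will be", age + 1)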
There are advantages and disadvantages to both the compiler and the
interpreter languages. The interpreter languages lack speed;
however, because they are translated as they run, they are very easily
“debugged,” or fixed and changed. Before the programmer using a
compiler language can try out a change, he must wait for the
compiler to translate the entire program into machine code all over
again. With an interpreter language, on the other hand, the ease
of modification comes with the price of slower performance and
limited capabilities.
The history of computers, the way computers handle information, and
the methods of programming all confirm that computers will never be
as intelligent as the people who design them.