History of Computers
This chapter is a brief summary of the history of computers. It is supplemented by the two PBS documentary videotapes "Inventing the Future" and "The Paperback Computer". The chapter highlights some of the advances to look for in the documentaries.
In particular, when viewing the movies you should look for two things:
The progression in hardware representation of a bit of data:
1. Vacuum tubes (1950s) - one bit on a device the size of a thumb;
2. Transistors (1950s and 1960s) - one bit on a device the size of a fingernail;
3. Integrated circuits (1960s and 70s) - thousands of bits on a device the size of a hand;
4. Silicon computer chips (1970s and on) - millions of bits on a device the size of a fingernail.
The progression of the ease of use of computers:
1. Almost impossible to use except by very patient geniuses (1950s);
2. Programmable by highly trained people only (1960s and 1970s);
3. Useable by just about anyone (1980s and on).
Watch for these two progressions to see how computers got smaller, cheaper, and easier to use.
First Computers
The first substantial computer was the giant ENIAC machine built by John W. Mauchly and J. Presper Eckert at the University of Pennsylvania. ENIAC (Electrical Numerical Integrator and Calculator) used a word of 10 decimal digits instead of binary ones like previous automated calculators/computers. ENIAC was also the first machine to use more than 2,000 vacuum tubes, using nearly 18,000 vacuum tubes. Storage of all those vacuum tubes and the machinery required to keep them cool took up over 167 square meters (1,800 square feet) of floor space. Nonetheless, it had punched-card input and output and arithmetically had 1 multiplier, 1 divider-square rooter, and 20 adders employing decimal "ring counters," which served as adders and also as quick-access (0.0002 seconds) read-write register storage.
The ENIAC computer
The executable instructions composing a program were embodied in the separate units of ENIAC, which were plugged together to form a route through the machine for the flow of computations. These connections had to be redone for each different problem, together with presetting function tables and switches. This "wire-your-own" instruction technique was inconvenient, and only with some license could ENIAC be considered programmable; it was, however, efficient in handling the particular programs for which it had been designed. ENIAC is generally acknowledged to be the first successful high-speed electronic digital computer (EDC) and was productively used from 1946 to 1955. A controversy developed in 1971, however, over the patentability of ENIAC's basic digital concepts, the claim being made that another U.S. physicist, John V. Atanasoff, had already used the same ideas in a simpler vacuum-tube device he built in the 1930s while at Iowa State College. In 1973, the court found in favor of the company using the Atanasoff claim, and Atanasoff received the acclaim he rightly deserved.
Progression of Hardware
In the 1950s, two devices were invented that would improve the computer field and set in motion the beginning of the computer revolution. The first of these two devices was the transistor. Invented in 1947 by William Shockley, John Bardeen, and Walter Brattain of Bell Labs, the transistor was fated to oust vacuum tubes from computers, radios, and other electronics.
The vacuum tube, used up to this time in almost all the computers and calculating machines, had been invented by American physicist Lee De Forest in 1906. The vacuum tube, which is about the size of a human thumb, worked by using large amounts of electricity to heat a filament inside the tube until it was cherry red. One result of heating this filament up was the release of electrons into the tube, which could be controlled by other elements within the tube. De Forest's original device was a triode, which could control the flow of electrons to a positively charged plate inside the tube. A zero could then be represented by the absence of an electron current to the plate; the presence of a small but detectable current to the plate represented a one.
Vacuum tubes
Vacuum tubes were highly inefficient, required a great deal of space, and needed to be replaced often. Computers of the 1940s and 50s had 18,000 tubes in them, and housing all these tubes and cooling the rooms from the heat produced by 18,000 tubes was not cheap. The transistor promised to solve all of these problems and it did so. Transistors, however, had their problems too. The main problem was that transistors, like other electronic components, needed to be soldered together. As a result, the more complex the circuits became, the more complicated and numerous the connections between the individual transistors became, and the likelihood of faulty wiring increased.
In 1958, this problem too was solved by Jack St. Clair Kilby of Texas Instruments. He manufactured the first integrated circuit or chip. A chip is really a collection of tiny transistors which are connected together when the chip is manufactured. Thus, the need for soldering together large numbers of transistors was practically nullified; now connections were needed only to other electronic components. In addition to saving space, the speed of the machine was now increased since there was a diminished distance that the electrons had to travel.
Circuit Board Silicon Chip
Mainframes to PCs
The 1960s saw large mainframe computers become much more common in large industries and with the US military and space program. IBM became the unquestioned market leader in selling these large, expensive, error-prone, and very hard to use machines.
Transistors
A veritable explosion of personal computers occurred in the late 1970s, starting with Steve Jobs and Steve Wozniak exhibiting the first Apple II at the First West Coast Computer Faire in San Francisco in 1977. The Apple II boasted a built-in BASIC programming language, color graphics, and a 4100-character memory for only $1298. Programs and data could be stored on an everyday audio-cassette recorder. Before the end of the fair, Wozniak and Jobs had secured 300 orders for the Apple II, and from there Apple just took off.
Also introduced in 1977 was the TRS-80. This was a home computer manufactured by Tandy Radio Shack. Its second incarnation, the TRS-80 Model II, came complete with a 64,000-character memory and a disk drive to store programs and data on. At this time, only Apple and TRS had machines with disk drives. With the introduction of the disk drive, personal computer applications took off, as a floppy disk was a most convenient publishing medium for distribution of software.
IBM, which up to this time had been producing mainframes and minicomputers for medium to large-sized businesses, decided that it had to get into the act and started working on the Acorn, which would later be called the IBM PC. The PC was the first computer designed for the home market that featured a modular design, so that pieces could easily be added to the architecture. Most of the components, surprisingly, came from outside of IBM, since building it with IBM parts would have cost too much for the home computer market. When it was introduced, the PC came with a 16,000-character memory, a keyboard from an IBM electric typewriter, and a connection for a tape cassette player, for $1265.
By 1984, Apple and IBM had come out with new models. Apple released the first-generation Macintosh, which was the first computer to come with a graphical user interface (GUI) and a mouse. The GUI made the machine much more attractive to home computer users because it was easy to use. Sales of the Macintosh soared like nothing ever seen before. IBM was hot on Apple's tail and released the 286-AT, which, with applications like Lotus 1-2-3, a spreadsheet, and Microsoft Word, quickly became the favorite of business concerns.
That brings us up to about ten years ago. Now people have their own personal graphics workstations and powerful home computers. The average computer a person might have in their home is more powerful by several orders of magnitude than a machine like ENIAC. The computer revolution has been the fastest growing technology in man's history.
Computer
A computer is a general-purpose device that can be programmed to carry out a set of arithmetic or logical operations automatically. Since a sequence of operations can be readily changed, the computer can solve more than one kind of problem.
Conventionally, a computer consists of at least one processing element, typically a central processing unit (CPU), and some form of memory. The processing element carries out arithmetic and logic operations, and a sequencing and control unit can change the order of operations in response to stored information. Peripheral devices allow information to be retrieved from an external source, and the result of operations saved and retrieved.
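As an illustration of these parts working together, here is a minimal sketch in Python of a toy stored-program machine; the instruction names (SET, ADD, JUMPNZ, HALT) and the run function are invented for this example and do not correspond to any real processor. The loop plays the role of the processing element, the program list is the stored information, and the conditional jump shows the control unit changing the order of operations.

    # Toy stored-program machine (illustrative only): 'program' is the stored
    # information, 'registers' is the data memory, and the loop below acts as
    # the processing element.
    def run(program, registers):
        pc = 0                                   # program counter: next instruction to execute
        while True:
            op, *args = program[pc]
            if op == "SET":                      # SET reg, value -> write a constant
                registers[args[0]] = args[1]
                pc += 1
            elif op == "ADD":                    # ADD reg, other -> reg = reg + other
                registers[args[0]] += registers[args[1]]
                pc += 1
            elif op == "JUMPNZ":                 # change the order of operations:
                pc = args[1] if registers[args[0]] != 0 else pc + 1
            elif op == "HALT":
                return registers

    # A stored program that adds 5 + 4 + 3 + 2 + 1; loading a different program
    # would make the same "machine" solve a different problem.
    program = [
        ("SET", "total", 0),        # 0
        ("SET", "count", 5),        # 1
        ("SET", "step", -1),        # 2
        ("ADD", "total", "count"),  # 3
        ("ADD", "count", "step"),   # 4: count = count - 1
        ("JUMPNZ", "count", 3),     # 5: repeat from instruction 3 while count != 0
        ("HALT",),                  # 6
    ]
    print(run(program, {}))         # {'total': 15, 'count': 0, 'step': -1}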
Mechanical analog computers started appearing in the first century and were
later used in the medieval era for astronomical calculations. In World War II,
mechanical analog computers were used for specialized military applications
such as calculating torpedo aiming. During this time the first
electronic digital computers were developed. Originally they were the size of a
large room, consuming as much power as several hundred modern personal
computers (PCs).[1]
Modern computers based on integrated circuits are millions to billions of times
more capable than the early machines, and occupy a fraction of the space.[2] Computers are small enough to fit into mobile devices, and mobile computers can be powered by small batteries. Personal computers in their various forms are icons of the Information Age and are generally considered as "computers". However, the embedded computers found in many devices from MP3 players to fighter aircraft and from electronic toys to industrial robots are the most numerous.
Etymology
The first known use of the word "computer" was in 1613 in a book called The
Yong Mans Gleanings by English writer Richard Braithwait: "I haue read the
truest computer of Times, and the best Arithmetician that euer breathed, and
he reduceth thy dayes into a short number." It referred to a person who carried out calculations or computations.
The engineer Tommy Flowers, working at the Post Office Research Station in London in the 1930s, began to explore the possible use of electronics for the telephone exchange. Experimental equipment that he built
in 1934 went into operation 5 years later, converting a portion of the telephone
exchange network into an electronic data processing system, using thousands
of vacuum tubes.[19] In the US, John Vincent Atanasoff and Clifford E. Berry of
Iowa State University developed and tested the Atanasoff–Berry
Computer (ABC) in 1942,[27] the first "automatic electronic digital computer".[28] This design was also all-electronic and used about 300 vacuum tubes, with
capacitors fixed in a mechanically rotating drum for memory.[29]
Colossus was the first electronic digital programmable computing device, and was used to
break German ciphers during World War II.
During World War II, the British at Bletchley Park achieved a number of
successes at breaking encrypted German military communications. The
German encryption machine, Enigma, was first attacked with the help of the
electro-mechanical bombes. To crack the more sophisticated German Lorenz
SZ 40/42 machine, used for high-level Army communications, Max
Newman and his colleagues commissioned Flowers to build the Colossus.[29] He spent eleven months from early February 1943 designing and building
the first Colossus.[30] After a functional test in December 1943, Colossus was
shipped to Bletchley Park, where it was delivered on 18 January 1944[31] and
attacked its first message on 5 February.[29]
Colossus was the world's first electronic digital programmable computer.[19] It
used a large number of valves (vacuum tubes). It had paper-tape input and
was capable of being configured to perform a variety of boolean
logical operations on its data, but it was not Turing-complete. Nine Mk II
Colossi were built (The Mk I was converted to a Mk II making ten machines in
total). Colossus Mark I contained 1500 thermionic valves (tubes), but Mark II
At the University of Manchester, a team under the leadership of Tom
Kilburn designed and built a machine using the newly
developed transistors instead of valves.[42] Their first transistorised
computer, and the first in the world, was operational by 1953, and a second
version was completed there in April 1955. However, the machine did make
use of valves to generate its 125 kHz clock waveforms and in the circuitry to
read and write on its magnetic drum memory, so it was not the first completely
transistorized computer. That distinction goes to the Harwell CADET of 1955,[43] built by the electronics division of the Atomic Energy Research
Establishment at Harwell.[44][45]
Integrated circuits
The next great advance in computing power came with the advent of
the integrated circuit. The idea of the integrated circuit was first conceived by
a radar scientist working for the Royal Radar Establishment of the Ministry of
Defence, Geoffrey W.A. Dummer. Dummer presented the first public
description of an integrated circuit at the Symposium on Progress in Quality
Electronic Components in Washington, D.C. on 7 May 1952.[46]
The first practical ICs were invented by Jack Kilby at Texas
Instruments and Robert Noyce at Fairchild Semiconductor.[47] Kilby recorded
his initial ideas concerning the integrated circuit in July 1958, successfully
demonstrating the first working integrated example on 12 September 1958.[48] In his patent application of 6 February 1959, Kilby described his new device
as "a body of semiconductor material ... wherein all the components of the
electronic circuit are completely integrated".[49][50] Noyce also came up with his
own idea of an integrated circuit half a year later than Kilby.[51] His chip solved
many practical problems that Kilby's had not. Produced at Fairchild
Semiconductor, it was made of silicon, whereas Kilby's chip was made
of germanium.
This new development heralded an explosion in the commercial and personal
use of computers and led to the invention of the microprocessor. While the
subject of exactly which device was the first microprocessor is contentious,
partly due to lack of agreement on the exact definition of the term
"microprocessor", it is largely undisputed that the first single-chip
microprocessor was the Intel 4004,[52] designed and realized by Ted
Hoff, Federico Faggin, and Stanley Mazor at Intel.[53]
Mobile computers become dominant
With the continued miniaturization of computing resources, and advancements
in portable battery life, portable computers grew in popularity in the 2000s.[54] The same developments that spurred the growth of laptop computers and
other portable computers allowed manufacturers to integrate computing
resources into cellular phones. These so-called smartphones and tablets run
on a variety of operating systems and have become the dominant computing
device on the market, with manufacturers reporting having shipped an
estimated 237 million devices in 2Q 2013.[55]
Programs
The defining feature of modern computers which distinguishes them from all
other machines is that they can be programmed. That is to say that some type
of instructions (the program) can be given to the computer, and it will process
them. Modern computers based on the von Neumann architecture often have
machine code in the form of an imperative programming language.
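As a rough illustration (the machine-level steps in the comments are hypothetical, not the instruction set of any particular CPU), even a program of a few high-level imperative statements corresponds to a short sequence of stored machine instructions:

    price = 4                 # e.g. load the constant 4 and store it at the location named 'price'
    quantity = 3              # load the constant 3 and store it at 'quantity'
    total = price * quantity  # fetch both values, multiply them, store the result in 'total'
    print(total)              # hand the result to an output routine; prints 12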
In practical terms, a computer program may be just a few instructions or
extend to many millions of instructions, as do the programs for word
processors and web browsers for example. A typical modern computer can
execute billions of instructions per second (gigaflops) and rarely makes a
mistake over many years of operation. Large computer programs consisting of
several million instructions may take teams of programmers years to write,
and due to the complexity of the task almost certainly contain errors.