
History of Computers

Feb 24, 2016




History of Computers
Tori Harbin, 3rd period

Who invented the first computer?
In 1837 Charles Babbage, a British professor of mathematics, had the idea for the Analytical Engine, the first design for a stored-program mechanical computer. Babbage is known as the "Father of Computing."
In 1939 John V. Atanasoff and Clifford Berry developed the Atanasoff-Berry Computer (ABC). The ABC was built by hand.
Construction of the ENIAC (Electronic Numerical Integrator and Computer) began in 1943, and the machine was completed in 1945.
The UNIVAC I (Universal Automatic Computer), manufactured by Remington Rand in the USA, was the first commercially available, mass-produced electronic computer; it was delivered to the US Census Bureau in June 1951.
By the 1990s the microcomputer, or personal computer (PC), had become a common household appliance, and it became even more widespread with the advent of the Internet.

What were computers made of?
First Generation (c. 1950): built from electronic valve (vacuum tube) technology. Consequently these machines were large, consumed a great deal of electrical power, and were not reliable enough for continuous operation for days at a stretch. They were only suitable for enthusiasts and people who had to have them.
Second Generation (c. 1960): built from transistors, which were so much smaller, consumed so much less power, and were so much longer-lived than valves that these were the first computers that could be successfully marketed.
Third Generation (c. 1970): built with Large Scale Integration (LSI), which put many transistors onto a single chip. This split the computer market into two distinct areas. One used the technology to build much more powerful computers at the same size and price as before, still called "mainframes". The other used it to build small, cheap computers that offered the power of the previous generation of mainframes at a fraction of the size and cost; these were called minicomputers.
Fourth Generation (c. 1980): built with Very Large Scale Integration (VLSI), which put many thousands of transistors onto a single chip. Again the market split in two. One side built still more powerful "mainframes" and "minicomputers"; the other built small, cheap machines that offered the power of the previous generation of minicomputers at a fraction of the size and cost. These were called microcomputers, and they were personal computers, since they were small and cheap enough that sitting idle when their owners weren't using them didn't matter.
Fifth Generation (c. 1990): The Japanese caused some alarm in the US and UK by announcing their intention to develop a Fifth Generation based on ideas from Artificial Intelligence research. Unlike the previous generations, this one was not based on a clear hardware discontinuity, and the project partly fell short of its original goals and was partly overtaken by events. No really good candidate for the Fifth Generation label has turned up yet, although one possibility is the linking of computers via the Internet and the World Wide Web, so that they have become participants in a very large and rapidly growing worldwide information store. While the ideas and early implementations are decades old (the Internet began as the DARPA defence research network in the US and as JANET, the Joint Academic Network, in the UK), it was only in the 1990s that large numbers of the general public acquired personal computers with Internet and Web capability as a standard feature.

Prices of computers.
As you can see, the price of computers has dropped drastically over the years. From 1999 to 2003 the index dropped over 20% every year. Lately the rate of decline has slowed, but prices are still falling: the index has decreased at a rate of 11-12% annually for the last three years.
Remember, this is an index of prices, so it's not directly tied to a specific dollar amount for a specific item. It covers personal computers and peripheral equipment as a whole, a combination of everything in that category: very expensive computers, cheap computers, monitors, printers, etc. So while the overall index may have fallen from a 700 level to a 100 level, that doesn't mean any individual computer part you buy today will cost 1/7th of what it cost 10 years ago. For example, the cheapest Dell system today costs about $400, but the cheapest Dell 10 years ago was not seven times that much.
Overall, this kind of price trend matches reality pretty well. I remember that in the late 1980s an IBM PC would cost in the ballpark of $3,000, and I spent about $1,500 on my first real PC back around 1994. Today you can buy a decent Dell system for around $400. A printer today can be bought for as little as $25-$50 on sale, but 10 years ago they'd easily run $150-$200 minimum.
From the index and from direct observation, it's clear that personal computer prices have been steadily decreasing over the past decade.
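To see how a steady percentage drop compounds into the index levels described above, here is a small illustrative Python calculation. The 20% figure is the one quoted in the text; the function itself is plain compound-decline arithmetic, not any official index methodology.

```python
def index_after(start, annual_drop, years):
    """Price index level after a constant annual percentage drop."""
    return start * (1 - annual_drop) ** years

# A 20% drop every year from 1999 to 2003 (four year-over-year drops):
print(round(index_after(100, 0.20, 4), 1))  # 41.0: the index falls by more than half
```

Four years of 20% drops cut the index to about 41% of its starting level, which is why a few years of steep declines move the index so far even before the slower 11-12% years are counted.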

Uses of a personal computer.
Like other computers, personal computers can be instructed to perform a variety of individual functions. A set of instructions that tells a computer what to do is called a program. Today, more than 10,000 application programs are available for use on personal computers. They include such popular programs as word processing programs, spreadsheet programs, database programs, and communication programs.
Word processing programs are used to type, correct, rearrange, or delete text in letters, memos, reports, and school assignments.
Spreadsheet programs enable individuals to prepare tables easily. The users of such programs establish rules for handling large groups of numbers. For example, a person can enter some numbers into a table and the program will calculate and fill in the rest of the table. When the user changes one number in the table, the other numbers change according to the rules established by that user. Spreadsheets may be used for preparing budgets and financial plans, balancing a chequebook, or keeping track of personal investments.
Database programs allow a computer to store large amounts of data (information) in a systematic way. Such data might include the name, address, telephone number, salary, and starting date of every employee in a company. The computer could then be asked to produce a list of all employees who receive a certain salary.
Communication programs connect a personal computer to other computers, so that people can exchange information with one another via their personal computers. They also enable people to link their personal computers with databanks: huge collections of information stored in large centralized computers. News, financial and travel information, and other data of interest to many users can be obtained from a databank.
Other programs include recreational and educational programs for playing games, composing and hearing music, and learning a variety of subjects. Programs have also been written that turn household appliances on and off. Some people develop their own programs to meet needs not covered by commercially prepared programs. Others buy personal computers mainly to learn about computers and how to program them.

How much electricity does the computer use?
A typical desktop computer uses about 65 to 250 watts. To find the figure for your particular computer you can contact the manufacturer (not me), or see my section on measuring electrical use.
Add another 17-72 watts for an LCD monitor, or about 80 watts if you have an old-school 17" CRT. Don't forget related devices: my cable modem uses 7 watts, my D-Link DI-604 router uses 4.5 watts, and my Motorola phone box for use with Vonage uses 2 watts while idle (3 when I'm on the phone).
Most laptop computers use about 15-45 watts, far less than desktops.
With most devices you can look at the label to see how much energy they use, but that doesn't work so well with computers, because the label gives the theoretical maximum, not the typical amount used. A computer whose label or power supply says 300 watts might use only about 70 watts when it's actually running, and only 100 even at peak times with serious number-crunching and all the drives spinning.
As long as your computer goes into sleep/standby when you're not using it, it doesn't use squat for electricity compared to the rest of your household. You'll save a lot more energy by addressing your heating, cooling, and lighting use than by obsessing over your computer. For most people, the computer's energy use is not a significant portion of their total use, even if they use their computers a lot. Of course, you should absolutely make sure your computer is set to sleep automatically when you're not using it, because it's silly to waste energy, but your computer likely isn't even close to being the biggest energy waster in your home.
How much does it cost to run your computer?
To calculate your costs, use this formula:
(Watts x Hours Used / 1000) x Cost per kilowatt-hour = Total Cost
For example, let's say you have a big high-end computer with a gaming-level graphics card and an old CRT monitor, and you leave them on 24/7. That's about 330 watts x 24 hours x 365 days/yr = 2,890,800 watt-hours, or about 2,891 kilowatt-hours. If you're paying $0.20 per kWh, you're paying about $578 a year to run your computer.
Let's try a different example: you have a computer that's less of an energy hog, like an iMac G5 20", which uses about 105 watts, and you're smart enough to turn it off when you're not using it. You use it for two hours a day, five days a week. That's ten hours a week, or 520 hours a year. So your 105 watts times 520 hours = 54,600 watt-hours. Divide by 1000 to get 54.6 kilowatt-hours; at $0.20 per kWh, that's about $10.92 a year.
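The cost formula is simple enough to script. Here is a minimal Python sketch; the wattages, hours, and the $0.20/kWh rate are the worked examples from the text, not measurements of any particular machine.

```python
def annual_cost(watts, hours_per_year, rate_per_kwh):
    """(Watts x Hours Used / 1000) x Cost per kWh = Total Cost."""
    kilowatt_hours = watts * hours_per_year / 1000
    return kilowatt_hours * rate_per_kwh

# 330 W gaming rig plus CRT left on 24/7, at $0.20/kWh:
print(round(annual_cost(330, 24 * 365, 0.20), 2))  # 578.16, about $578/yr
# 105 W iMac used 10 hours a week (520 hours/yr):
print(round(annual_cost(105, 520, 0.20), 2))  # 10.92
```

Plugging in your own wattage (measured, not the nameplate maximum) and your utility's rate gives the same back-of-the-envelope estimate the formula describes.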