Structure of the microprocessor
The microprocessor owes its phenomenal success to a paradox created by the combination of technology and economics. Thanks to techniques that squeeze roughly twice as many circuits onto silicon every 18 months or so by decreasing line widths, increasing wafer diameters, and adding layers, each new generation eventually comes to market at around the same price as the last, but with twice the power. More compact circuitry makes microprocessors faster because electrons have less distance to travel. As chips get smaller, more of them can be etched onto a silicon wafer of the same diameter by improved fabrication equipment that today handles multiple-layer, eight-inch wafers as easily as it did two- and three-inch, single-layer wafers ten years ago. Consequently, microprocessors have taken over functions that used to require warehouses of discrete components, whetting a seemingly limitless appetite for increasingly affordable, higher-performing chips.
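To see what an 18-month doubling means in numbers, here is a small illustrative Python sketch; the only outside fact assumed is the Intel 4004's starting figure of roughly 2,300 transistors, and the function name is invented for the example:

    # Illustrative arithmetic only: projects circuit counts under the
    # rule of thumb that density doubles roughly every 18 months.
    def projected_transistors(initial_count, years, doubling_period=1.5):
        """Projected transistor count after the given number of years."""
        return initial_count * 2 ** (years / doubling_period)

    # Starting from roughly 2,300 transistors in 1971:
    for year in (1971, 1981, 1991):
        print(year, round(projected_transistors(2300, year - 1971)))
    # Twenty years of doubling turns thousands of circuits into tens of millions.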
Over the last decade, these advances in pricing and processing power have made the personal computer the largest consumer of microprocessors. At the same time, microprocessors have transformed the ubiquitous PC from a stand-alone office workhorse doing word processing and spreadsheets into a widely connected information machine that can send faxes and e-mail, access on-line services, or provide a video link for meetings. Today, Pentium processors and clones are driving the PC into untapped new frontiers of mass-market communications and interactive multimedia home computing. By the turn of the century, when high-volume chips are capable of executing more than a billion instructions per second, doors will open to brave new worlds we can only begin to imagine, such as holographic videoconferencing and personal digital assistants that beep your cardiologist when your stock portfolio slides.
[Figure: The Pentium microprocessor (actual size)]
HOW MICROPROCESSORS WORK
Microprocessor---A microprocessor, also called a CPU, is a tiny, enormously powerful, high-speed electronic brain etched on a single silicon semiconductor chip, which contains the basic logic, storage, and arithmetic functions of a computer. It thinks for the computer and, like a traffic cop, coordinates its operations. It receives and decodes instructions from input devices like keyboards, disks, or cassettes, then sends them over a bus system consisting of microscopic etched conductive "wiring" to be processed by its arithmetic and logic unit. The results are temporarily stored in memory cells and released in a timed sequence through the bus system to output devices such as CRT screens, networks, or printers.
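The following toy Python program models the cycle just described: instructions are fetched from memory, decoded, executed by an ALU, held in a register, and finally released to an output device. The three-instruction machine and its opcodes are invented purely for illustration:

    # Toy model of the fetch-decode-execute cycle (hypothetical instruction set).
    memory = [
        ("LOAD", 7),     # place the value 7 in the accumulator register
        ("ADD", 5),      # the ALU adds 5 to the accumulator
        ("PRINT", None), # release the result to an output device
    ]

    accumulator = 0                 # a register: temporary storage for the ALU
    for opcode, operand in memory:  # fetch the next instruction over the "bus"
        if opcode == "LOAD":        # decode, then execute
            accumulator = operand
        elif opcode == "ADD":
            accumulator += operand  # the ALU's arithmetic function
        elif opcode == "PRINT":
            print(accumulator)      # output: 12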
The first microprocessor, the 4-bit Intel 4004 (1971), measured just 1/8" by 1/16", yet was as powerful as the first electronic computer 25 years earlier (1946), which weighed 30 tons, used 18,000 vacuum tubes, and required so much electricity that the lights of West Philadelphia are said to have dimmed each time it was turned on. Today, DEC's 64-bit Alpha microprocessor is more than 550 times as powerful as the 4004, with speeds comparable to yesterday's mainframes.
Programmability---CPUs can be programmed by the chip manufacturer, a distributor, or the computer manufacturer. Programs for a single-purpose product, like a calculator or a video game, are generally written by the OEM and entered into memory by the CPU manufacturer or distributor. For PCs, the CPU, which must perform a wide range of tasks, is generally programmed by the computer's manufacturer. The user merely inserts a prerecorded cassette tape, cartridge, or floppy disk containing instructions for each application into the computer, and the CPU performs the instructions.
Key Components---A microprocessor has five key components: an arithmetic and logic unit (ALU), which calculates and thinks logically; registers, which are memory cells that store information temporarily for the ALU; a control unit, which decodes input instructions and acts as a traffic cop; bus systems, which are submicron wiring routes connecting the entire system; and a clock, which times the sequential release of the processed data.
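To make the division of labor concrete, the sketch below groups the same five components into one hypothetical Python structure. Real microprocessors implement all of this in hardware, not software, and every name here is invented:

    # Hypothetical software model of the five components named above.
    class ToyCPU:
        def __init__(self, program):
            self.registers = {"acc": 0}  # registers: temporary storage cells
            self.bus = program           # bus: the route instructions travel on
            self.clock = 0               # clock: times each step

        def alu(self, a, b):             # ALU: does the calculating
            return a + b

        def run(self):                   # control unit: decodes and directs traffic
            for opcode, operand in self.bus:
                self.clock += 1          # one tick per instruction
                if opcode == "ADD":
                    self.registers["acc"] = self.alu(self.registers["acc"], operand)
            return self.registers["acc"]

    print(ToyCPU([("ADD", 2), ("ADD", 3)]).run())  # prints 5 after two clock ticks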
Computer Interface---In addition to a CPU, a computer requires memory and ports for connecting it to input/output devices. A device that includes a CPU, memory, and input/output ports on a single chip is called a microcontroller. The two basic types of memory are RAM (Random Access Memory) and ROM (Read Only Memory). RAM stores modifiable programs and data that the user needs to perform immediate tasks. ROM stores unalterable programs that govern the computer's standard operations. Input devices like keyboards, mice, and programs on cassette tape, cartridges, disks, and CD-ROM enable us to communicate with the computer. Output devices like monitors and printers enable the computer to communicate with us, and modems enable computers to communicate with each other.
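The RAM/ROM distinction can be illustrated with a deliberately simplified Python sketch; real memory is addressed by hardware, and the classes and values here are invented for the example:

    # Simplified model: ROM is fixed at manufacture, RAM is writable.
    class ROM:
        def __init__(self, contents):
            self._cells = tuple(contents)  # immutable: burned in at the factory
        def read(self, address):
            return self._cells[address]

    class RAM:
        def __init__(self, size):
            self._cells = [0] * size       # modifiable working storage
        def read(self, address):
            return self._cells[address]
        def write(self, address, value):
            self._cells[address] = value

    rom = ROM([0x55, 0xAA])    # e.g. startup code the user cannot alter
    ram = RAM(16)
    ram.write(0, rom.read(1))  # the CPU copies a value from ROM into RAM
    print(hex(ram.read(0)))    # 0xaa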
It is astonishing how electronic computer technology has transformed our world, considering it is only 50 years old and the microprocessor, which revolutionized computers, is less than 25 years old. Its development involved the convergence of three evolving technologies: the calculator, the computer, and the transistor.
The first "digital calculators"--fingers and thumbs--are still in use, as is the 5,000-year-old abacus. Calculators go back to 17th-century inventions, including Schickard's Calculating Clock (1623), Pascal's Pascaline (1642), and Leibniz's Stepped Reckoner (1673)--machines that used hand-made gears and wheels to do arithmetic.
The next breakthrough came as a result of the Industrial Revolution, during the first half of the 19th century, with Charles Babbage's steam-powered mechanical computing and printing machines: the Difference Engine and the Analytical Engine. Although the machines were never completed, their designs included a "mill" to carry out arithmetic processes, a "store" to hold data, a "barrel" to control the operation, and "punch cards" to provide the external program--all fundamentals of modern computers.
Business calculators appeared next. In 1884, Felt invented the key-driven mechanical calculator. In 1893, Steiger introduced a mass-produced, automated version of Leibniz's machine, widely used for scientific calculations.
Twentieth Century---In the late 19th century, a flood of immigration created a major problem for US census takers. Sorting the 1890 census information by hand would have taken a decade, rendering the data virtually useless. But in 1889, Herman Hollerith came up with a solution: the first electromechanical machine for recording and tabulating digital material.
In the 1920s and 30s, the use of punch card equipment for data processing expanded. In 1933, Columbia University received an endowment of punch card and accounting machines from IBM's Thomas Watson, which led Wallace Eckert to create a mechanical program to link them together, closing the gap between calculators and future computers. Shannon and Stibitz's later discovery that relay circuits could perform binary math provided the basis for the rise of the electronic computer.
World War II---During World War II, military requirements for gun trajectories and code breaking accelerated computer research. Though Germany showed little interest in computers, in 1941 the German engineer Konrad Zuse completed the Z3, the first electromechanical, general-purpose, program-controlled computer, later destroyed by Allied bombs. In 1943, building on Alan Turing's codebreaking work against the Nazi ENIGMA machine, the British built COLOSSUS, an electronic cryptanalysis machine that broke even higher-level German ciphers--a crucial breakthrough in winning the war.
In 1946, ENIAC, the first general-purpose, program-controlled, all-electronic, vacuum-tube-based digital computer, was completed at the University of Pennsylvania to solve ballistics equations. ENIAC worked 500 times faster than the earlier electromechanical Harvard Mark I (1937-44), which doomed the electromechanical approach. Other electronic computers quickly followed, including the Cambridge University-built EDSAC (1946-49), the first full-scale stored-program computer, and MIT's Whirlwind (1945-50), the first interactive, parallel, real-time computer.
Vacuum Tubes---Vacuum tubes became the standard circuitry element of the first generation of mass-produced computers in the 1950s. UNIVAC I, built in 1951 by Remington Rand's Eckert-Mauchly Computer Division, made Remington the leader. In 1953, MIT's Whirlwind introduced magnetic core memory, which became the primary memory for most computers until the mid 70s. In the mid 50s, with UNIVAC I obsolete and UNIVAC II delayed, IBM's 705 (a vacuum tube, magnetic core memory machine) established IBM as the large-scale computer leader--a lead it never relinquished. But tube-based computers encountered problems of power, temperature, size and, especially, tube maintenance.
The Transistor---The solution, the transistor, came from the world of telecommunications. Wartime research had produced new information on the normally non-conductive elements germanium and silicon, which become conductors when their atomic structure is altered by impurities and a very tiny current is applied. This led to a project begun at Bell Labs in 1946 to develop a "solid state" telephone signal amplifier to replace vacuum tube ones. Bell's John Bardeen and Walter Brattain, led by William Shockley, invented the point-contact transistor there in 1947, a turning point in electronics and computer technology.
Commercial viability began with Shockley's cheaper, more reliable junction transistor
...