Modern Computers


From UNIVAC to the latest desktop PCs, computer evolution has moved very rapidly. The first-generation computers were known for using vacuum tubes in their construction. The generation to follow would use the much smaller and more efficient transistor.

From Tubes to Transistors

Any modern digital computer is largely a collection of electronic switches. These switches are used to represent and control the routing of data elements called binary digits (or bits). Because of the on or off nature of the binary information and signal routing the computer uses, an efficient electronic switch was required. The first electronic computers used vacuum tubes as switches, and although the tubes worked, they had many problems.

The type of tube used in early computers was called a triode and was invented by Lee De Forest in 1906 (see Figure 1.1). It consists of a cathode and a plate, separated by a control grid, suspended in a glass vacuum tube. The cathode is heated by a red-hot electric filament, which causes it to emit electrons that are attracted to the plate. The control grid in the middle can control this flow of electrons: when the grid is made negative, the electrons are repelled back toward the cathode; when it is made positive, they are attracted toward the plate. Thus, by controlling the grid, you can switch the output of the plate on or off.

Figure 1.1. The three main components of a basic triode vacuum tube.


Unfortunately, the tube was inefficient as a switch. It consumed a great deal of electrical power and gave off enormous heat, a significant problem in the earlier systems. Primarily because of the heat they generated, tubes were notoriously unreliable; in larger systems, one failed every couple of hours or so.

The invention of the transistor, or semiconductor, was one of the most important developments leading to the personal computer revolution. The transistor was first invented in 1947 and announced in 1948 by Bell Laboratory engineers John Bardeen and Walter Brattain. Bell associate William Shockley invented the junction transistor a few months later, and all three jointly shared the Nobel Prize in Physics in 1956 for inventing the transistor. The transistor, which essentially functions as a solid-state electronic switch, replaced the less-suitable vacuum tube. Because the transistor was so much smaller and consumed significantly less power, a computer system built with transistors was also much smaller, faster, and more efficient than a computer system built with vacuum tubes.

Transistors are made primarily from the elements silicon and germanium, with certain impurities added. Depending on the impurities added (its electron content), the material becomes known as either N-Type (negative) or P-Type (positive). Both types are conductors, allowing electricity to flow in either direction. However, when the two types are joined, a barrier is formed where they meet that allows current to flow in only one direction when a voltage is present in the right polarity. This is why they are typically called semiconductors.

A transistor is made by placing two P-N junctions back to back. They are made by sandwiching a thin wafer of one type of semiconductor material between two wafers of the other type. If the wafer in between is made from P-type material, the transistor is designated an NPN. If the wafer in between is N-type, the transistor is designated PNP.

In an NPN transistor, the N-type semiconductor material on one side of the wafer is called the emitter (or source) and is normally connected to a negative current (see Figure 1.2). The P-type material in the center is called the base, and the N-type material on the other side of the base is called the collector (or drain).

Figure 1.2. Elements of an NPN transistor.


An NPN transistor is analogous to a triode tube: the emitter is equivalent to the cathode, the base is equivalent to the grid, and the collector is equivalent to the plate. By controlling the current at the base, you can control the flow of current between the emitter and collector.
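To make the switch analogy concrete, here is a minimal Python sketch (hypothetical and idealized; it models only the on/off behavior described above, not real transistor physics) that treats an NPN transistor as a switch whose base input decides whether current flows from emitter to collector, and shows how two such switches wired in series behave as a logic gate:

def npn_switch(base_on):
    # Idealized NPN transistor as a digital switch:
    # current flows from emitter to collector only when the base is driven.
    return base_on

def and_gate(a, b):
    # Two idealized switches in series conduct only when both are on,
    # which is how transistors are combined to route binary data (bits).
    return npn_switch(a) and npn_switch(b)

print(and_gate(True, False))  # False: one switch is off, so no current flows
print(and_gate(True, True))   # True: both switches conduct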

Compared to the tube, the transistor is much more efficient as a switch and can be miniaturized to microscopic scale. In June 2001, Intel researchers unveiled the world's smallest and fastest silicon transistors, only 20 nanometers (billionths of a meter) in size. These are expected to appear in PC processors around 2007, in chips with one billion transistors running at speeds of 20GHz! By comparison, in 2005 the AMD Athlon 64 X2 dual-core processor had more than 233 million transistors, and the Pentium Extreme Edition dual-core processor had more than 230 million transistors.

The conversion from tubes to transistors began the trend toward miniaturization that continues to this day. Today's small laptop (or palmtop) PC and even Tablet PC systems, which run on batteries, have more computing power than many earlier systems that filled rooms and consumed huge amounts of electrical power.

Although vacuum tubes have been replaced in virtually all consumer applications by transistors and integrated circuits, they remain popular for high-end audio applications because they produce a warmer and richer sound than transistors do. To capitalize on this effect, Acer's Aopen division once released a series of motherboards designed for audiophiles (the AX4B-533 Tube, for example) that used a dual-triode tube along with a special noise-reduction design to produce excellent music playback.

Integrated Circuits

The third generation of modern computers is known for using integrated circuits instead of individual transistors. In 1959, engineers at Texas Instruments invented the integrated circuit (IC), a semiconductor circuit that contains more than one transistor on the same base (or substrate material) and connects the transistors without wires. The first IC contained only six transistors. Today, many ICs have transistor counts in the multimillion range; the AMD Athlon 64 X2 dual-core processor, for example, has more than 233 million transistors!

The First Microprocessor

Intel was founded on July 18, 1968 (as N M Electronics) by Robert Noyce and Gordon Moore. Almost immediately they changed the company name to Intel and were joined by cofounder Andrew Grove. They had a specific goal: to make semiconductor memory practical and affordable. This was not a given at the time, considering that silicon chip-based memory was at least 100 times more expensive than the magnetic core memory commonly used in those days. At the time, semiconductor memory was going for about a dollar a bit, whereas core memory was about a penny a bit. Noyce said, "All we had to do was reduce the cost by a factor of a hundred; then we'd have the market; and that's basically what we did."

By 1970, Intel was known as a successful memory chip company, having introduced a 1Kb memory chip, much larger than anything else available at the time. (1Kb equals 1,024 bits, and a byte equals 8 bits. This chip, therefore, stored only 128 bytes, not much by today's standards.) Known as the 1103 dynamic random access memory (DRAM), it became the world's largest-selling semiconductor device by the end of the following year. By this time, Intel had also grown from the core founders and a handful of others to more than 100 employees.
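As a quick check of that arithmetic, the following small Python sketch converts the 1103's capacity from kilobits to bytes:

# Intel 1103 DRAM capacity: 1Kb = 1,024 bits; 1 byte = 8 bits
bits = 1024
print(bits // 8)  # 128 bytes -- not much by today's standards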

Because of Intel's success in memory chip manufacturing and design, Japanese manufacturer Busicom asked Intel to design a set of chips for a family of high-performance programmable calculators. At the time, all logic chips were custom designed for each application or product; because each chip was specific to a particular application, no one chip could achieve widespread use.

Busicom's original design for its calculator called for at least 12 custom chips. Intel engineer Ted Hoff rejected the unwieldy proposal and instead designed a single-chip, general-purpose logic device that retrieved its application instructions from semiconductor memory. As the core of a four-chip set, this central processing unit could be controlled by a program and essentially tailored to the task at hand. The chip was generic in nature, meaning it could function in designs other than calculators. Previous designs were hard-wired for one purpose, with built-in instructions; this chip would read a variable set of instructions from memory, which would control the function of the chip. The idea was to design, on a single chip, almost an entire computing device that could perform various functions, depending on which instructions it was given.

There was one problem with the new chip: Busicom owned the rights to it. Hoff and others knew that the product had almost limitless application, bringing intelligence to a host of "dumb" machines. They urged Intel to repurchase the rights to the product. While Intel founders Gordon Moore and Robert Noyce championed the new chip, others within the company were concerned that the product would distract Intel from its main focus: making memory. They were finally convinced by the fact that every four-chip microcomputer set included two memory chips. As the director of marketing at the time recalled, "Originally, I think we saw it as a way to sell more memories, and we were willing to make the investment on that basis."

Intel offered to return Busicom's $60,000 investment in exchange for the rights to the product. Struggling with financial troubles, the Japanese company agreed. Nobody else in the industry at the time, even at Intel, realized the significance of this deal. Of course, it paved the way for Intel's future in processors. The result was the 1971 introduction of the 4-bit Intel 4004 microcomputer set (the term microprocessor was not coined until later). Smaller than a thumbnail and packing 2,300 transistors with 10-micron (millionth of a meter) spacing, the $200 chip delivered as much computing power as one of the first electronic computers, ENIAC. By comparison, ENIAC relied on 18,000 vacuum tubes packed into 3,000 cubic feet (85 cubic meters) when it was built in 1946. The 4004 ran at 108KHz (just over one tenth of 1MHz) and executed 60,000 operations per second, primitive by today's standards but a major breakthrough at the time.

Intel introduced the 8008 microcomputer in 1972, which processed 8 bits of information at a time, twice as much as the original chip. By 1981, Intel's microprocessor family had grown to include the 16-bit 8086 and the 8-bit 8088 processors. These two chips garnered an unprecedented 2,500 design wins in a single year. Among those designs was a product from IBM that was to become the first PC.

Note

The term PC defines a type of personal computer using Intel architecture processors and loosely based on the original IBM PC, XT, and AT designs. Other types of personal computers existed before the PC, but what we call PCs today have dominated the market since they were first introduced in 1981.


In 1982, Intel introduced the 286 chip. With 134,000 transistors, it provided about three times the performance of other 16-bit processors of the time. Featuring on-chip memory management, the 286 also offered software compatibility with its predecessors. This revolutionary chip was first used in IBM's benchmark PC-AT, the system upon which all modern PCs are based.

In 1985 came the Intel 386 processor. With a new 32-bit architecture and 275,000 transistors, the chip could perform more than five million instructions every second (MIPS). Compaq's DESKPRO 386 was the first PC based on the new microprocessor.

Next out of the gate was the Intel 486 processor in 1989. The 486 had 1.2 million transistors and the first built-in math coprocessor. It was some 50 times faster than the original 4004, equaling the performance of some mainframe computers.

Then, in 1993, Intel introduced the first P5 family (586) processor, called the Pentium, setting new performance standards with several times the performance of the previous 486 processor. The Pentium processor used 3.1 million transistors to perform up to 90 MIPS, about 1,500 times the speed of the original 4004.

Note

Intel's change from using numbers (386/486) to names (Pentium/Pentium Pro) for its processors was based on the fact that it could not secure a registered trademark on a number and therefore could not prevent its competitors from using those same numbers on clone chip designs.


The first processor in the P6 (686) family, called the Pentium Pro processor, was introduced in 1995. With 5.5 million transistors, it was the first to be packaged with a second die containing high-speed L2 memory cache to accelerate performance.

Intel revised the original P6 (686/Pentium Pro) and introduced the Pentium II processor in May 1997. Pentium II processors had 7.5 million transistors packed into a cartridge rather than a conventional chip, allowing the L2 cache chips to be attached directly to the module. The Pentium II family was augmented in April 1998 with both the low-cost Celeron processor for basic PCs and the high-end Pentium II Xeon processor for servers and workstations. Intel followed with the Pentium III in 1999, essentially a Pentium II with Streaming SIMD Extensions (SSE) added.

Around the time the Pentium was establishing its dominance, AMD acquired NexGen, which had been working on its Nx686 processor. AMD incorporated that design along with a Pentium interface into what would be called the AMD K6. The K6 was both hardware and software compatible with the Pentium, meaning it plugged into the same Socket 7 and could run the same programs. As Intel dropped its Pentium in favor of the Pentium II and III, AMD continued making faster versions of the K6 and made huge inroads in the low-end PC market.

During 1998, Intel became the first to integrate L2 cache directly on the processor die (running at the full speed of the processor core), dramatically increasing performance. This was first done on the second-generation Celeron processor (based on the Pentium II core), as well as the Pentium IIPE (performance-enhanced) chip used only in notebook systems. The first high-end desktop PC chip with on-die full-core speed L2 cache was the second-generation (Coppermine core) Pentium III introduced in late 1999. After this, all the major processor manufacturers also integrated the L2 cache on the processor die, a trend that continues today.

AMD introduced the Athlon in 1999 to compete with Intel head to head in the high-end desktop PC market. The Athlon became very successful, and it seemed for the first time that Intel had some real competition in the higher-end systems. In hindsight the success of the Athlon might be easy to see, but at the time it was introduced, its success was anything but assured. Unlike the previous K6 chips, which were both hardware- and software-compatible with Intel processors, the Athlon was only software-compatible and required a motherboard with an Athlon-supporting chipset and processor socket.

The year 2000 saw both companies introduce more new chips to the market. AMD premiered both its Athlon Thunderbird and Duron processors. The Duron is essentially an Athlon with a smaller L2 cache, a lower-cost chip targeted primarily as competition for Intel's Celeron processors, whereas the Thunderbird uses a more integrated on-die cache to ratchet up the Athlon's performance.

Intel introduced the Pentium 4 in late 2000, the latest processor in the Intel Architecture 32-bit (IA-32) family. It also announced the Itanium processor (code named Merced), the first IA-64 (Intel Architecture, 64-bit) processor. The Itanium is Intel's first processor with 64-bit instructions, opening a whole new category of operating systems and applications while still remaining backward compatible with 32-bit software.

2000 also saw another significant milestone written into the history books when both Intel and AMD crossed the 1GHz barrier, a speed that many thought could never be accomplished.

See "Intel Itanium and Itanium 2," p. 193.


In 2001, Intel introduced a Pentium 4 version running at 2GHz, the first PC processor to achieve that speed. AMD also introduced the Athlon XP, based on its newer Palomino core, as well as the Athlon MP, designed for multiprocessor server systems. During 2001, both AMD and Intel continued to increase the speed of their chips and enhance the existing Pentium III/Celeron, Pentium 4, and Athlon/Duron processors.

In 2002, Intel released a Pentium 4 version running at 3.06GHz, the first PC processor to achieve that speed. This and subsequent 3GHz+ processors feature Intel's hyper-threading (HT) technology, which turns the processor into a virtual dual-processor configuration. By running two application threads at the same time, HT-enabled processors can perform tasks 25%-40% faster than non-HT-enabled processors can. HT technology also works with Windows XP Home Edition, which doesn't support dual-processor motherboards.
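As a rough illustration of that figure (hypothetical baseline time; it simply applies the quoted 25%-40% speedup), a task that takes a non-HT processor 100 seconds would finish in roughly 71-80 seconds on an HT-enabled one:

# Hypothetical illustration of a 25%-40% hyper-threading speedup
baseline_seconds = 100.0            # assumed time for the task without HT
for speedup in (1.25, 1.40):        # "25%-40% faster"
    print(f"{speedup:.2f}x faster -> {baseline_seconds / speedup:.1f} s")
# Output: 1.25x faster -> 80.0 s
#         1.40x faster -> 71.4 s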

In 2003, AMD released its first 64-bit processor: the Athlon 64 (previously code named ClawHammer or K8). Unlike Intel's first 64-bit processors, the server-oriented Itanium and Itanium 2, which are optimized for a new 64-bit architecture and are relatively slow at running 32-bit x86 instructions used by conventional processors, the Athlon 64 is a 64-bit extension of the x86 family typified by the Athlon, Pentium 4, and earlier processors. Thus, the Athlon 64 runs 32-bit software as quickly as it runs 64-bit software. Intel followed with the Pentium 4 Extreme Edition, the first consumer-level processor that incorporated L3 cache. The whopping 2MB of cache added greatly to the transistor count as well as performance.

In 2005, both Intel and AMD released their first dual-core processors, basically integrating two processors into a single chip. Although boards supporting two or more processors had been commonly used in network servers for many years before that, this brought dual-CPU capabilities in an affordable package to standard PCs. Rather than attempting to increase clock rates, as was done in the past, integrating two or more processors into a single chip allows future processors to perform more work with fewer bottlenecks and with a reduction in both power consumption and heat production.



