At the dawn of the 19th century, Benjamin Franklin's discovery of the principles of electricity was still fairly new, and practical applications of his discoveries were few -- the most notable being the lightning rod, which was invented independently by two different people in two different places. Independent contemporaneous (and not so contemporaneous) discovery would remain a recurring theme in electronics.
So it was with the invention of the vacuum tube -- invented by Fleming, who was investigating the effect named for and discovered by Edison; it was refined four years later by de Forest (but is now rumored to have been invented 20 years prior by Tesla). So it was with the transistor: Shockley, Brattain, and Bardeen were awarded the Nobel Prize for turning de Forest's triode into a solid-state device -- but they were not awarded a patent, because of 20-year-prior art by Lilienfeld. So it was with the integrated circuit (or IC), for which Jack Kilby was awarded a Nobel Prize but which was contemporaneously developed by Robert Noyce of Fairchild Semiconductor (who got the patent). And so it was, indeed, with the microprocessor.
Just a scant few years after the first laboratory integrated circuits, Fairchild Semiconductor introduced the first commercially available integrated circuit (although Texas Instruments offered one at almost the same time).
Process technology that would last until the present day was already available at the start of the decade: commercial ICs made with the planar process were available from both Fairchild Semiconductor and Texas Instruments by 1961, and TTL (transistor-transistor logic) circuits appeared commercially in 1962. By 1968, CMOS (complementary metal oxide semiconductor) hit the market. There is no doubt that technology, design, and process were evolving rapidly.
Observing this trend, Gordon Moore, Fairchild Semiconductor's director of Research & Development, noted in 1965 that the density of elements in ICs was doubling annually, and he predicted that the trend would continue for the next ten years. With certain amendments, this came to be known as Moore's Law.
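To get a feel for what annual doubling implies, here is a back-of-the-envelope sketch in Python; the starting density is a made-up placeholder, not Moore's actual 1965 figure.

```python
# Rough illustration of annual doubling; the 1965 starting density is hypothetical.
BASE_YEAR, BASE_DENSITY = 1965, 64          # assumed elements per chip in 1965

def projected_density(year, doubling_period_years=1.0):
    """Density projected forward from the base year, doubling once per period."""
    return BASE_DENSITY * 2 ** ((year - BASE_YEAR) / doubling_period_years)

for year in (1965, 1970, 1975):
    print(year, int(projected_density(year)))
# Annual doubling means 32x the density by 1970 and roughly 1,000x by 1975.
```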
The first ICs contained just a few transistors per wafer; by the dawn of the 1970s, production techniques allowed for thousands of transistors per wafer. It was only a matter of time before someone would use this capacity to put an entire computer on a chip, and several someones, indeed, did just that.
Development explosion: The 1970s
The idea of a computer on a single chip had been described in the literature as far back as 1952 (see Resources), and more articles like this began to appear as the 1970s dawned. Finally, process had caught up to thinking, and the computer on a chip was made possible. The air was electric with the possibility.
Once the feat had been established, the rest of the decade saw a proliferation of companies old and new getting into the semiconductor business, as well as the first personal computers, the first arcade games, and even the first home video game systems -- thus spreading consumer contact with electronics, and paving the way for continued rapid growth in the 1980s.
At the beginning of the 1970s, microprocessors had not yet been introduced. By the end of the decade, a saturated market led to price wars, and many processors were already 16-bit.
At the time of this writing, three groups lay claim to having been the first to put a computer on a chip: the Central Air Data Computer (CADC), the Intel® 4004, and the Texas Instruments TMS 1000.
The CADC system was completed for the Navy's "Tomcat" fighter jets in 1970. It is often discounted because it was a chip set and not a single-chip CPU. The TI TMS 1000 was first to market in calculator form, but not in stand-alone form -- that distinction goes to the Intel 4004, which is just one of the reasons it is often cited as the first (incidentally, it too was just one chip in a set of four).
In truth, it does not matter who was first. As with the lightning rod, the light bulb, radio -- and so many other innovations before and after -- it suffices to say it was in the aether, it was inevitable, its time had come.
Where are they now?
The CADC spent more than 20 years in top-secret, Cold War-era mothballs until finally being declassified in 1998. Thus, even if it was the first, it has remained under most people's radar to this day, and it did not have a chance to influence other early microprocessor designs.
The Intel 4004 had a short and mostly uneventful history, and was soon superseded by the 8008 and other early Intel chips (see below).
In 1973, Texas Instruments' Gary Boone was awarded U.S. Patent No. 3,757,306 for the single-chip microprocessor architecture. The chip was finally marketed in stand-alone form in 1974, for the low, low (bulk) price of US$2 apiece. In 1978, a special version of the TI TMS 1000 was the brains of the educational "Speak & Spell" toy that E.T. jerry-rigged to phone home.
Early Intel: 4004, 8008, and 8080
Intel released its single-chip, 4-bit, all-purpose processor, the Intel 4004, in November 1971. It had a clock speed of 108 kHz and 2,300 transistors, with ports for ROM, RAM, and I/O. The chip was originally designed for use in a calculator, and Intel had to renegotiate its contract to be able to market it as a stand-alone processor. Its ISA had been inspired by the DEC PDP-8.
The Intel 8008 was introduced in April 1972, and didn't make much of a splash, being more or less an 8-bit 4004. Its primary claim to fame is that its ISA -- provided by Computer Terminal Corporation (CTC), who had commissioned the chip -- was to form the basis for the 8080, as well as for the later 8086 (and hence the x86) architecture. Lesser-known Intels from this time include the nearly forgotten 4040, which added logical and compare instructions to the 4004, and the ill-fated 32-bit Intel 432.
Intel put itself back on the map with the 8080, which used the same instruction set as the earlier 8008 and is generally considered to be the first truly usable microprocessor. The 8080 had a 16-bit address bus and an 8-bit data bus, a 16-bit stack pointer into memory (which replaced the 8-level internal stack of the 8008), and a 16-bit program counter. It also provided 256 I/O ports, so I/O devices could be connected without taking away from or interfering with the addressing space, and it had a signal pin that allowed the stack to occupy a separate bank of memory. These features are what made it a truly modern microprocessor. It was used in the Altair 8800, one of the first renowned personal computers (other claimants to that title include MIT Lincoln Labs' 1963 12-bit LINC/Laboratory Instrument Computer, built with DEC components, and DEC's own 1965 PDP-8).
Although the 4004 had been the company's first microprocessor, it was really the 8080 that clinched its future -- this was immediately apparent, and in fact in 1974 the company changed its phone number so that the last four digits would be 8080.
Where is Intel now?
Last time we checked, Intel was still around.
In 1974, RCA released the 1802, an 8-bit processor with an architecture quite different from that of other 8-bit processors. It had a register file of 16 registers of 16 bits each; using the SEP instruction, you could select any of the registers to be the program counter, and using the SEX instruction, you could select any of them to be the index register. It did not have standard subroutine CALL and RET instructions, though they could be emulated.
A few commonly used subroutines could be called quickly by keeping their addresses in some of the 16 registers. Before such a subroutine returned, it jumped to the location immediately preceding its entry point, so that after the SEP "return" instruction handed control back to the caller, the register would be pointing to the right value for next time. An interesting variation was to have two or more subroutines in a ring so that they were called in round-robin order.
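To make that convention concrete, here is a minimal sketch of the idea as a toy simulator in Python rather than real 1802 assembly; the opcode names, addresses, and register assignments (R3 as the main program counter, R4 as the subroutine's) are illustrative assumptions.

```python
# Toy model of a register-selected program counter (1802 SEP-style calls).
def run(memory, regs, p):
    """Step the toy machine; whichever register index `p` selects is the PC."""
    while True:
        pc = regs[p]
        op, arg = memory[pc]
        regs[p] = pc + 1              # advance the currently selected counter
        if op == "SEP":               # select another register as the PC;
            p = arg                   # serves as both "call" and "return"
        elif op == "BR":              # unconditional branch within the current PC
            regs[p] = arg
        elif op == "PRINT":
            print(arg)
        elif op == "HALT":
            return

memory = {
    0: ("PRINT", "main: first call"),
    1: ("SEP", 4),                    # "call": R4 becomes the program counter
    2: ("PRINT", "main: second call"),
    3: ("SEP", 4),                    # call again; R4 already points at the entry
    4: ("HALT", None),
    99:  ("SEP", 3),                  # "return", placed just BEFORE the entry point
    100: ("PRINT", "  subroutine body"),  # subroutine entry point
    101: ("BR", 99),                  # finish by branching to the return at 99,
                                      # leaving R4 pointing at 100 for the next call
}

regs = [0] * 16
regs[3] = 0        # main program counter
regs[4] = 100      # subroutine entry point
run(memory, regs, p=3)
```

After each "return," R4 again holds the subroutine's entry address, which is exactly why the exit instruction is placed just before the entry point.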
The RCA 1802 is considered one of the first RISC chips, although others (notably Seymour Cray -- see the sidebar, The evolution of RISC) had used the concepts already.
Where is it now?
Sadly, the RCA chip was a spectacular market failure due to its slow clock speed. But it could be fabricated to be radiation-resistant, so it was used on the Voyager 1, Viking, and Galileo space probes (where rapidly executed commands aren't a necessity).
In 1975, IBM® made one of the earliest efforts to build a microprocessor based on RISC design principles (although it wasn't called RISC yet -- see the sidebar, The evolution of RISC). The project began as a research effort led by John Cocke (the father of RISC); many say that the IBM 801 was named after the address of the building where the chip was designed, but we suspect that the IBM systems already numbered 601 and 701 had at least something to do with it as well.
Where is the 801 now?
The 801 chip family never saw mainstream use and was employed primarily in other IBM hardware. Even though the 801 never went far, it did inspire further work that would converge, fifteen years later, to produce the Power Architecture™ family.
In 1975, Motorola introduced the 6800, a chip with 78 instructions and probably the first microprocessor with an index register.
Two things are of significance here. One is the use of the index register, which is a processor register (a small amount of fast computer memory that is used to speed the execution of programs by providing quick access to commonly used values). The index register can modify operand addresses during the run of a program, typically while doing vector/array operations, as the sketch below illustrates. Before the invention of index registers, and without indirect addressing, array operations had to be performed either by linearly repeating program code for each array element or by using self-modifying code techniques. Both of these methods harbor significant disadvantages for program flexibility and maintenance, and, more importantly, they are wasteful of scarce computer memory.
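Here is a rough sketch of the idea in Python terms rather than 6800 assembly; the memory layout and values are assumptions for illustration. The point is that an indexed operand's effective address is simply base + X, so one short loop replaces per-element code.

```python
# Sketch of what an index register buys you: one loop that computes operand
# addresses as base + X, instead of repeating code once per array element.
memory = [0] * 32
base = 8                          # array of four values stored at addresses 8..11
memory[base:base + 4] = [10, 20, 30, 40]

# Without indexing, summing the array means spelling out each absolute address:
total = memory[8] + memory[9] + memory[10] + memory[11]

# With an index register X, the same code handles any element (and any length):
acc, x = 0, 0                     # accumulator and index register
while x < 4:
    acc += memory[base + x]       # effective address = base + X (indexed addressing)
    x += 1

assert acc == total == 100
```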
Where is the 6800 now?
Many Motorola stand-alone processors and microcontrollers trace their lineage to the 6800, including the popular and powerful 6809 of 1979.
Soon after Motorola released the 6800, the company's design team quit en masse and formed their own company, MOS Technology. They quickly developed the MOS 6501, a completely new design that was nevertheless pin-compatible with the 6800. Motorola sued, and MOS agreed to halt production. The company then released the MOS 6502, which differed from the 6501 only in the pin-out arrangement.
The MOS 6502 was released in September 1975, and it sold for US$25 per unit. At the time, the Intel 8080 and the Motorola 6800 were selling for US$179. Many people thought this must be some sort of scam. Eventually, Intel and Motorola dropped their prices to US$79. This had the effect of legitimizing the MOS 6502, and the chips began selling by the hundreds. The 6502 was a staple in the Apple® II and various Commodore and Atari computers.
Where is the MOS 6502 now?
Many of the original MOS 6502s still have loving homes today, in the hands of collectors (or even the original owners) of machines like the Atari 2600 video game console, the Apple II family of computers, the first Nintendo Entertainment System, and the Commodore 64 -- all of which used the 6502. MOS 6502 processors are still being manufactured today for use in embedded systems.
Advanced Micro Devices (AMD) was founded in 1969 by Jerry Sanders. Like so many of the people who were influential in the early days of the microprocessor (including the founders of Intel), Sanders came from Fairchild Semiconductor. AMD's business was not the creation of new products; it concentrated on making higher-quality versions of existing products under license. For example, all of its products met MILSPEC requirements no matter what the end market was. In 1975, it began selling reverse-engineered clones of the Intel 8080 processor.
Where is AMD now?
In the 1980s, first licensing agreements -- and then legal disputes -- with Intel eventually led to court validation of clean-room reverse engineering and opened the floodgates in the 1990s to many clone makers.