Friday, May 15, 2009

The dawning of the age of RISC: The 1980s


Advances in process ushered in the "more is more" era of VLSI, leading to true 32-bit architectures. At the same time, the "less is more" RISC philosophy allowed for greater performance. When combined, VLSI and RISC produced chips with awesome capabilities, giving rise to the UNIX® workstation market.

The decade opened with intriguing contemporaneous independent projects at Berkeley and Stanford -- RISC and MIPS. Even with the new RISC families, an industry shakeout commonly referred to as "the microprocessor wars" would mean that we left the 1980s with fewer major microprocessor manufacturers than we had coming in.

By the end of the decade, prices had dropped substantially, so that record numbers of households and schools had access to more computers than ever before.

RISC and MIPS and POWER

RISC, too, started in many places at once, and was antedated by some of the examples already cited (see the sidebar, The evolution of RISC).

Berkeley RISC
In 1980, the University of California at Berkeley started something it called the RISC Project (in fact, the professors leading the project, David Patterson and Carlo H. Sequin, are credited with coining the term "RISC").

The project emphasized pipelining and the use of register windows: by 1982, they had delivered their first processor, called the RISC-I. With only 44,000 transistors (compared with about 100,000 in most contemporary processors) and only 32 instructions, it outperformed any other single-chip design in existence.
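
Register windows are easier to picture with a small sketch. The toy Python model below (purely illustrative, not from the original article) shows the core idea: a procedure call slides the visible window across a large register file so that the caller's outgoing argument registers become the callee's incoming ones, and no values need to be saved to or restored from memory. The window size, overlap, and all names here are hypothetical, not the RISC-I's actual parameters.

    # Toy model of overlapping register windows (illustrative only).
    WINDOW_SIZE = 16   # registers visible to one procedure (hypothetical)
    OVERLAP = 4        # registers shared between caller and callee

    class RegisterFile:
        def __init__(self, total=128):
            self.regs = [0] * total
            self.base = 0          # start of the current window

        def window(self):
            """Registers visible to the currently running procedure."""
            return self.regs[self.base : self.base + WINDOW_SIZE]

        def call(self, args):
            """Slide the window: outgoing arguments land in the overlap."""
            out_start = self.base + WINDOW_SIZE - OVERLAP
            self.regs[out_start : out_start + len(args)] = args
            self.base = out_start  # callee's "in" regs == caller's "out" regs

        def ret(self):
            """Slide the window back; the caller's registers are untouched."""
            self.base -= WINDOW_SIZE - OVERLAP

    rf = RegisterFile()
    rf.call([10, 20])        # pass two arguments with no loads or stores
    print(rf.window()[:2])   # the callee sees [10, 20] directly
    rf.ret()

Real hardware also has to spill the oldest windows to memory when calls nest deeper than the register file allows; the sketch leaves that detail out.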

MIPS
Meanwhile, in 1981, just across the San Francisco Bay from Berkeley, John Hennessy and a team at Stanford University started building what would become the first MIPS processor. They wanted to use deep instruction pipelines -- a difficult-to-implement practice -- to increase performance. A major obstacle to pipelining was that it required hard-to-implement interlocks: circuitry to ensure that an instruction needing multiple clock cycles stops the pipeline from feeding it data until it has completed. The MIPS design eliminated interlocking with a relatively simple demand: every instruction must take only one clock cycle. This was a potentially useful refinement of the RISC philosophy.
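
The interlock problem is easier to see with a toy model. The Python sketch below (purely illustrative, not from the original article) runs a tiny program through a simulated pipeline in which a load's result is not ready on the following cycle. With a hardware interlock, the pipeline stalls by itself; in the MIPS-style alternative, there is no interlock and the assembler must fill the load delay slot, here with a NOP. The instruction format and latency numbers are hypothetical.

    # Toy model of a load-use hazard and the two ways to resolve it.
    LOAD_LATENCY = 2  # cycles before a loaded value is usable (hypothetical)

    def run(program, hardware_interlock):
        regs, pending, cycle = {}, {}, 0
        for op, dst, src in program:
            cycle += 1
            if hardware_interlock:
                # The interlock stalls until the value being read is ready.
                while src in pending and pending[src] > cycle:
                    cycle += 1          # each stall is a pipeline bubble
            if op == "load":
                pending[dst] = cycle + LOAD_LATENCY
                regs[dst] = src         # toy: treat the source as the loaded value
            elif op == "add1":
                # Without an interlock, reading too early would use stale data.
                assert src not in pending or pending[src] <= cycle, \
                    "hazard: value not ready (needs an interlock or a NOP)"
                regs[dst] = regs[src] + 1
            elif op == "nop":
                pass
        return regs, cycle

    hazardous = [("load", "r1", 42), ("add1", "r2", "r1")]
    scheduled = [("load", "r1", 42), ("nop", None, None), ("add1", "r2", "r1")]

    print(run(hazardous, hardware_interlock=True))   # the interlock stalls a cycle
    print(run(scheduled, hardware_interlock=False))  # MIPS-style: the NOP fills the slot

Either way the hazard is resolved; the MIPS bet was that moving the job into software kept the hardware simpler and the clock faster.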

POWER
Also contemporaneously and independently, IBM continued to work on RISC. The 801 project of 1974 turned into Project America and Project Cheetah. Project Cheetah yielded the first workstation to use a RISC chip, in 1986: the PC/RT, which used the 801-inspired ROMP chip.

Where are they now?
By 1983, the RISC Project at Berkeley had produced the RISC-II, which contained 39 instructions and ran more than three times as fast as the RISC-I. Sun Microsystems' SPARC (Scalable Processor ARChitecture) design was heavily influenced by the minimalist RISC-I and RISC-II designs of the RISC Project.

Professors Patterson and Sequin are both still at Berkeley.

MIPS was used in Silicon Graphics workstations for years. Although SGI's newest offerings now use Intel processors, MIPS is very popular in embedded applications.

Professor Hennessy took a leave from Stanford in 1984 to co-found MIPS Computer Systems. The company's commercial 32-bit designs implemented the interlocks in hardware. MIPS was purchased by Silicon Graphics, Inc. in 1992 and was spun off as MIPS Technologies, Inc. in 1998. John Hennessy is currently Stanford University's tenth President.

IBM's Cheetah project, which developed into the PC/RT's ROMP, was a bit of a flop, but Project America was in prototype by 1985 and would, in 1990, become the RISC System/6000. Its processor would later be known as the POWER1.

RISC was quickly adopted in the industry, and today it remains the most popular architecture for processors. During the 1980s and early 1990s, several additional RISC families were launched. Besides those already mentioned, they included:

  • CRISP (C-language Reduced Instruction Set Processor) from AT&T Bell Labs.
  • The Motorola 88000 family.
  • Digital Equipment Corporation's Alpha (the world's first single-chip 64-bit microprocessor).
  • HP Precision Architecture (HP PA-RISC).

32-bitness

The early 1980s also saw the first 32-bit chips arrive in droves.

BELLMAC-32A
AT&T's Computer Systems division opened its doors in 1980, and by 1981 it had introduced the world's first single-chip 32-bit microprocessor: Bell Labs' BELLMAC-32A (renamed the WE 32000 after the break-up in 1984). There were two subsequent generations, the WE 32100 and WE 32200, which were used in:

  • the 3B5 and 3B15 minicomputers
  • the 3B2, the world's first desktop supermicrocomputer
  • the "Companion", the world's first 32-bit laptop computer
  • "Alexander", the world's first book-sized supermicrocomputer

All ran the original Bell Labs UNIX.

Motorola 68010 (and friends)
Motorola had already introduced the MC 68000, which had a 32-bit architecture internally but a 16-bit external bus. It followed with the MC 68010 and 68012, shipped its first fully 32-bit implementation, the MC 68020, by 1985 or thereabouts, and began work on a 32-bit family of RISC processors, the 88000.

NS 32032
In 1983, National Semiconductor introduced the NS 16032, a microprocessor with a 32-bit internal architecture and a 16-bit external bus, followed by the fully 32-bit NS 32032 and a line of 32-bit industrial OEM microcomputers. Sequent later introduced the first symmetric multiprocessing (SMP) server-class computer, which used the NS 32032.

Intel entered the 32-bit world in 1981, the same year as the AT&T BELLMAC chips, with the ill-fated iAPX 432. It was a three-chip design rather than a single-chip implementation, and it didn't go anywhere. The 32-bit i386, introduced in late 1985, became Intel's first single-chip 32-bit offering, followed by the i486 in 1989.

Where are they now?
AT&T closed its Computer Systems division in December 1995, and the company shifted to MIPS and Intel chips.

Sequent's SMP machine faded away, and that company also switched to Intel microprocessors.

The Motorola 88000 design wasn't commercially available until 1990, and was cancelled soon after in favor of Motorola's deal with IBM and Apple to create the first PowerPC.

ARM is born

In 1983, Acorn Computers Ltd. was looking for a processor. Some say that Acorn was refused access to Intel's upcoming 80286 chip; others say that Acorn rejected both the Intel 286 and the Motorola MC 68000 as not powerful enough. In any case, the company decided to develop its own processor, called the Acorn RISC Machine, or ARM. The company had development samples, known as the ARM I, by 1985; production models (ARM II) were ready by the following year. The original ARM chip contained only 30,000 transistors.

Where are they now?
Acorn Computers was taken over by Olivetti in 1985 and, after a few more shakeups, was purchased by Broadcom in 2000.

However, the company's ARM architecture today accounts for approximately 75% of all 32-bit embedded processors. The most successful implementation has been the ARM7TDMI, with hundreds of millions sold in cellular phones. The StrongARM, co-developed by Digital and ARM, became the basis for Intel's XScale processors.
