Small wonder

What would we do without the microprocessor? In the 25 years since its invention, it has changed all our lives. And it's still developing rapidly as its components approach their theoretical minimum size. What does the future hold? By Ian Grayson

Ian Grayson
Monday 18 November 1996 00:02 GMT

They're everywhere. Deep inside personal computers, mobile phones and video recorders they sit, ready to obey our every command. They happily control household appliances such as toasters, microwave ovens and dishwashers, and every time we board a plane or drive a late-model car we entrust our lives to them. Yet just 25 years ago, they didn't exist at all.

Microprocessors, the tiny brains that control so many facets of Nineties life, first appeared in November 1971. Hailed as a major breakthrough in micro-engineering, they were able to perform tasks that traditionally had required devices of far greater size and complexity. They brought affordable computing power to the masses, rather than just the select few.

Essentially, a microprocessor is a collection of thousands or millions of tiny on/off switches contained on a chip of silicon. These switches, called transistors, turn electrical signals on and off in patterns. The patterns, determined by software, are the basis on which a multitude of tasks are performed, from cooking a chicken to launching a space shuttle mission.
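To make the switch-and-pattern idea concrete, here is a minimal sketch - in Python, purely as a present-day illustration, with names chosen for this example rather than taken from any real chip - of how on/off switches combine into the logic gates and adders at the heart of every processor:

    # Each transistor acts as an on/off switch; True/False stands in for a
    # switch being on or off. Wiring switches together yields logic gates.

    def AND(a: bool, b: bool) -> bool:
        return a and b    # on only when both inputs are on

    def XOR(a: bool, b: bool) -> bool:
        return a != b     # on when exactly one input is on

    def half_adder(a: bool, b: bool):
        """Add two one-bit numbers, returning (sum bit, carry bit)."""
        return XOR(a, b), AND(a, b)

    # 1 + 1 in binary: sum bit 0, carry bit 1 - that is, binary 10, decimal 2.
    print(half_adder(True, True))    # -> (False, True)

Chain millions of such gates together, switching millions of times a second, and the stored patterns - the software - become cooked chickens and launched shuttles.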

The significance of the microprocessor's invention has been compared with that of steam power in the 18th century and electricity in the 19th - a bold claim, but one given credence by the fact that the market for such devices has grown from nothing to around $130bn last year. Industry estimates put this figure at $250bn by the year 2000.

The first microprocessor, built by the Intel Corporation, contained 2,300 transistors. It packed as much number-crunching power as the first electronic computer, ENIAC; but rather than filling a room, it was contained on a chip the size of a thumbnail.

Amazing as it was, the development of this chip was a risk for Intel because there was no sure market for multi-purpose microprocessors. Technology had not progressed beyond integrated circuit (IC) chips, which traditionally had been developed for a specific task and for a specific customer.

Keith Chapple, Intel Corporation UK's managing director, says no one predicted the potential market for the new device.

"It was seen in the early days as being used for controlling things such as traffic lights and calculators, but the PC itself was not foreseen at that time," he recalls. "People were still ... thinking of the computer as a mainframe product, and that the total world demand for them [PCs] would be small."

However, as companies began to realise the potential uses for microprocessors, development and manufacture gained pace. Intel followed its first chip, code-named the 4004, with a second just months later. The 8008 chip, with some 3,500 transistors, was twice as powerful as its predecessor.

A major evolutionary leap came with the next product from Intel's laboratories. The 8080 chip, launched in 1974, was the brain of the world's first personal computer, the Altair. Within months, tens of thousands of people had purchased kits enabling them to build their own machines. The PC revolution had begun. The 8086, the fourth incarnation of Intel's design, followed in 1978; when IBM chose its 8088 variant in 1981 to power its new product, the IBM PC, the move created an industry standard and, together with Microsoft's Disk Operating System (DOS) software, formed the basis for today's personal computers.

Chip development continued to gain pace. Other manufacturers, most notably Motorola, released their own designs, which quickly gained market acceptance. But the PC market remained in Intel's grasp. The company produced the 286 microprocessor in 1982, followed it with the 386 in 1985 and rolled out the 486 in 1989.

Exponential growth had come into play: the capacity of chips was doubling every 18 to 24 months. This trend, known as "Moore's law" after Intel's co-founder Gordon Moore, who first observed it in 1965, has continued ever since and shows no sign of abating. Put another way, every two years as much computing power is manufactured as previously existed on the entire planet.
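As a rough check of that doubling arithmetic, the short calculation below - illustrative Python, using only figures quoted in this article - projects the first chip's 2,300 transistors of 1971 forward at the two doubling rates mentioned:

    # Project a transistor count forward under Moore's law.
    def transistors(start_count, years_elapsed, doubling_period_years):
        return start_count * 2 ** (years_elapsed / doubling_period_years)

    for period in (1.5, 2.0):    # doubling every 18 and every 24 months
        count = transistors(2300, 1996 - 1971, period)
        print(f"Doubling every {period} years: ~{count:,.0f} transistors by 1996")

    # Every two years gives roughly 13 million; every 18 months, roughly
    # 240 million. Today's chips, with their 5 million-plus transistors,
    # land within a factor of a few of the slower projection - about as
    # close as a rule of thumb can be expected to get.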

According to Dr Gordon Bennett, corporate vice-president and general manager of Motorola SPS, this is due to the "incredible shrinking transistor".

"You can go back in time to when you used to have only tens or hundreds on a chip," Dr Bennett says, "but during the past two decades this has been taken to millions. It's got to the stage where we are talking about widths on a chip of 0.18 microns - bacteria are eight microns wide." Dr Bennett recalls that everyone talked about hitting barriers, in terms of cramming more and more on to a single chip, but that barrier has yet to be reached.

Mr Chapple agrees: "We certainly see Moore's law continuing for the next 10 to 15 years, which is about as far ahead as we can see. When you get to that stage you are looking at the line-width in use on chips being just a few atoms wide. Where things go after that is not yet clear."
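To put those widths in perspective, here is one more illustrative calculation; the figure of roughly 0.24 nanometres for the spacing between silicon atoms is general knowledge rather than anything quoted in this article:

    # How many silicon atoms span the line widths discussed above?
    NM_PER_MICRON = 1000         # 1 micron = 1,000 nanometres
    SI_ATOM_SPACING_NM = 0.24    # approximate spacing of silicon atoms

    line_width_nm = 0.18 * NM_PER_MICRON
    atoms_across = line_width_nm / SI_ATOM_SPACING_NM
    print(f"A 0.18-micron line is about {atoms_across:,.0f} atoms across")

    # Roughly 750 atoms: minute, yet still some eight halvings away from
    # the few-atom limit Mr Chapple describes - consistent with his
    # estimate of another 10 to 15 years of shrinkage.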

Such a rapidly evolving market has attracted more players keen for a piece of the chip action. The second-largest supplier of Intel-compatible microprocessors for PCs is US-based AMD, founded in 1969. Also gaining ground is Cyrix Corporation, founded in 1988, while Advanced RISC Machines (ARM), established by Acorn Computers and Apple Computer, focuses on Reduced Instruction Set Computing (RISC) chips - the design approach also behind the PowerPC processors used in many Macintosh computers.

In the early Nineties, microprocessor vendors began to recognise that a new market was developing, a market that offered as much growth potential as the corporate sector had during the Eighties: the home.

In 1993, with this firmly in mind, Intel moved away from the numbering system it had used to identify previous chip generations and christened its latest achievement the Pentium. Backed by a multi-million-dollar marketing drive, Pentium became a byword for PC power and performance and brought microprocessors to the public's attention for the first time.

Charting the future of the microprocessor is no easy task, and few industry watchers are keen to predict events more than a couple of years ahead. The latest generation now contains more than 5 million transistors and executes millions of operations per second - a far cry from the original designs of the early Seventies.

As Intel's Mr Moore once said: "If the auto industry advanced as rapidly as the semiconductor industry, a Rolls-Royce engine would get half a million miles to the gallon, and would be cheaper to throw away than to park."

On one point everyone agrees: the impact of microprocessors is growing, touching all facets of modern life.

"We have already surrounded ourselves with the ubiquitous chip," says Dr Bennett. "In the Seventies it was small, in the Eighties it started to grow and in the Nineties it has exploded. Now you're surrounded by them"
