Science: The future is very, very fast: Parallel supercomputers have jumped out of the lab to support powerful new applications for commercial users, writes John Stansell

The Independent Online
Supercomputers are moving into industry and commerce, after years of specialist use by scientists and engineers modelling the weather, simulating oil reservoirs and designing complex weapon systems. Increasingly they are being used by stock market traders, business managers and marketing specialists to extract vital information, very fast, from previously impenetrable masses of data.

But these are not the supercomputers of the Eighties - the vast and expensive Cray Research, Fujitsu or Control Data machines. They are small, low-cost, astonishingly fast and powerful computers known as parallel processors. Until this year, parallel machines were the domain of specialist companies, often small by computer industry standards and frequently reliant on venture capital for their existence. Now, however, companies of the stature of IBM, ICL, Cray Research and Intel have launched parallel machines with applications software that meets the needs of commercial users.

They come with many gigabytes (thousands of megabytes) of storage and working memory and run at speeds measured in thousands of millions of floating point operations per second - a performance yardstick known as gigaflops. These parallel computers have two other advantages over conventional - otherwise known as sequential - machines: they are far cheaper, often less than a quarter of the price; and they are 'scalable', meaning that they can be increased in speed and power as needs dictate by adding extra processing modules or 'nodes'.
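To make the gigaflops yardstick and the scalability claim concrete, here is a short back-of-envelope calculation in Python. The figures (a hypothetical 2-gigaflops node and a 16-node machine) are illustrative assumptions, not from the article:

```python
# Hypothetical back-of-envelope figures, not from the article:
# how long a fixed workload takes at a given gigaflops rating,
# and how adding processing nodes scales the aggregate speed.
GIGA = 1_000_000_000  # one gigaflop = a thousand million floating point ops/sec

def runtime_seconds(total_flops, gflops_rating):
    # Time to finish total_flops operations at the given sustained rate.
    return total_flops / (gflops_rating * GIGA)

def scaled_gflops(per_node_gflops, nodes):
    # Ideal linear scaling: each extra node adds its full rating.
    return per_node_gflops * nodes

# A job of 1e12 floating point operations on a 2-gigaflops machine:
print(runtime_seconds(1e12, 2))   # 500.0 seconds
# A 16-node machine at 2 gigaflops per node, under ideal scaling:
print(scaled_gflops(2, 16))       # 32 gigaflops
```

In practice scaling is rarely perfectly linear, since the nodes must communicate, but the sketch shows why adding modules "as needs dictate" is attractive.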

Almost all today's computers - from the desktop PC to the Meteorological Office's Cray supercomputer - are sequential. In other words they process information a little at a time, storing the answers until the job is complete before spitting the solution out. Parallel machines, by contrast, divide up the task, share it out among a number of processors, and combine the results at the end.
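The divide, share out and combine pattern described above can be sketched in a few lines of modern Python (an illustration of the idea, not code from any of the machines mentioned): a summation is split into chunks, each chunk handed to a separate worker process, and the partial answers merged at the end.

```python
# A minimal sketch of the parallel model: divide the task,
# share it among a number of processors, combine the results.
from concurrent.futures import ProcessPoolExecutor

def partial_sum(chunk):
    # Each worker ("node") handles its own slice of the data.
    return sum(chunk)

def parallel_sum(data, workers=4):
    # Divide: split the data into one chunk per worker.
    size = (len(data) + workers - 1) // workers
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    # Share out, then combine the partial results at the end.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    # Same answer as the sequential sum, arrived at in parallel.
    print(parallel_sum(list(range(1_000_000))))  # 499999500000
```

A sequential machine would walk the whole list itself; here the work is spread across four processes and only the final combining step is done in one place.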

There are two main reasons for the sudden emergence of parallel machines from the laboratory. First, they are based on low-cost and widely available microprocessors that conform to open systems standards. Second, they use operating systems such as Unix that allow, in theory at least, applications currently run on conventional computers to be transferred to parallel machines with a minimum of modification.

Supercomputing '93 - the annual industry showcase - held in Portland, Oregon in mid-November, was dominated by parallel machines and by the emerging confidence of industry analysts that 'Parallel is prime time', to quote Howard Richmond of the analysts Gartner Group. Already, IBM, Cray Research and Intel have made inroads into the commercial market, with financial trading systems, decision support software and database management applications.

Colin Upstill, director of the Parallel Applications Centre (PAC) in Southampton, set up under a Department of Trade and Industry initiative in 1992, believes that within a couple of years the use of parallel systems will be dominated by database applications. ICL, which launched its Goldrush parallel server for databases last October, agrees with that. Even smaller firms, founded to develop and produce parallel machines for scientific and engineering tasks, are saying they can switch to commercial applications.

Typical is the use of a parallel machine to give traders in financial securities far more information to help close a deal. Prudential Securities in the United States is already using an Intel parallel system that allows dealers to select from up to 200 possible outcomes of an investment decision while in the process of negotiating with a potential buyer. Rover Cars, in Britain, is working with the PAC (which has machines from all the leading vendors) to develop a marketing and manufacturing strategy based on the use of a parallel system that will allow it to guarantee 48-hour manufacture and delivery of a car that has been built to the precise specification laid down by the customer.

Scientific uses within industry for commercial reasons are another growth area. For example, a technology known as '3-D pre-stack seismology' is helping the oil industry to decide more accurately, and far faster, whether a recent oil discovery is likely to become a profitable reservoir. This technology, which in effect shows the geologist a 'slice-by-slice' picture of the earth's crust, is cost effective only if the seismic data are processed on a parallel machine.

Tensor Geophysical, an American exploration company, is using Intel parallel machines on board exploration vessels to do preliminary processing of seismic data before it is sent ashore for full appraisal. When such a vessel detects a potential oil-bearing structure, it orders a parallel computer to be shipped out. Within half a day, says Tensor's Walter Lynn, the machine is in use, and helping to minimise the time spent on analysing the potential size of a reservoir.

While 1993 was the year that saw the entry of 'mainstream' computer firms into the parallel processing arena, most analysts and company executives believe that it will take a further year at least before the new computers really start to take off. Justin Rattner, Intel's director of technology, believes that 1995 will be the first year of real growth. Irving Wladawsky-Berger, IBM's head of parallel systems, says that by the end of this year the new-style supercomputers will be coming into use in databases, file servers and transaction processing.

Steve Conway, of Cray Research, believes progress will be slower, mostly because of the large installed base of applications software in commercial organisations that will only run on conventional computers. However, he believes parallel processing is ideal for some applications - indeed, will make them possible - which is why his company, the largest maker of conventional supercomputers in the world, has moved into the parallel market.