"Who are we? And where do we come from?" These are the two greatest questions that have ever taxed mankind, but despite the extraordinary march of science, the answers remain elusive.
Now, where the human brain has failed, a global "computer brain" is set to triumph. Some herald it as a spur to world scientific cooperation, while others see it as deeply sinister.
Chipping away at the frontiers of those big questions are two branches of scientific research: biotechnology and particle physics. The first hopes to unlock all the secrets of the human genome, the second to prove the "big bang" theory and describe how the universe began.
But these disciplines have hit a technological brick wall. Both produce a formidable volume of data, and as that has swollen, the computer power needed to deal with it has mushroomed. To meet this demand, companies such as IBM and Compaq have built ever more powerful machines. The "Deep Blue" computer that defeated Garry Kasparov at chess has been left behind – machines now have hundreds of times more processing muscle.
But even the most powerful networks still fall 10,000 times short of what is needed. The solution lies in linking computers to form a "Grid" of such processing power that it can cope with the vast supply of data of the future. The Grid will become the next generation of the internet.
That is what IBM and a consortium of research centres are now trying to build – a network linking supercomputers around the world to set them all working on the same problem and have them speaking the same language. The scientific community says it will be like setting a single mega-brain on to cracking the secrets of life.
The Grid could resemble the SETI@home project, which sets home computers around the world on to the job of interpreting signals from outer space in the hope of finding intelligent life. It would enable thousands of scientists to share their resources. In the UK, the idea is being embraced by the Government and by scientists in particle physics. Their aim is to build the Grid by 2005 – the year when a new particle accelerator, the Large Hadron Collider, comes on stream at Cern, the high-energy particle physics laboratory in Switzerland.
The new collider is being built to prove the existence of a crucial sub-atomic particle predicted by Professor Peter Higgs at Edinburgh University in 1964. The Higgs boson holds the key to the universe: it gives matter its mass. However, the LHC will generate millions of billions of bits of information in its searches each year. This is far more than existing supercomputers can handle.
The Grid would allow all that data to be chopped up and sent out for number-crunching by research computers – and, in principle, people's PCs at home – and then be re-formed to produce the answers the scientists want. It would short-cut a process that would otherwise take billions of hours to calculate.
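The chop-up, farm-out and re-form workflow described here is the classic "scatter-gather" pattern of distributed computing. A minimal illustrative sketch follows – every name in it is hypothetical, and local worker processes merely stand in for the Grid's thousands of machines:

```python
# Scatter-gather sketch: chop the data into chunks, farm each chunk
# out for number-crunching, then re-form the partial answers.
# All names are illustrative, not part of any real Grid software.
from multiprocessing import Pool

def crunch(chunk):
    """Stand-in for the analysis each remote computer would run."""
    return sum(x * x for x in chunk)

def scatter(data, n_chunks):
    """Chop the dataset into roughly equal chunks."""
    size = (len(data) + n_chunks - 1) // n_chunks
    return [data[i:i + size] for i in range(0, len(data), size)]

def gather(partials):
    """Re-form the partial answers into the final result."""
    return sum(partials)

if __name__ == "__main__":
    data = list(range(1_000_000))
    chunks = scatter(data, n_chunks=8)
    with Pool(8) as pool:  # local stand-in for a network of PCs
        partials = pool.map(crunch, chunks)
    print(gather(partials) == sum(x * x for x in data))  # prints True
```

The point of the pattern is that each chunk can be crunched anywhere, in any order; only the final gather step needs to see all the partial results.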
At the centre of this system will be a gigantic supercomputer – by far the biggest ever built – with a storage capacity equal to 146 novels. At the processing end of things, the computers making up the network will be "Linux clusters" – the combined computing power of the "brain" will be able to do 13 thousand billion calculations a second.
Yet the biggest problem is whether the Grid will actually work. Will it be faultless or open to the glitches that every computer user suffers? As the Government's Particle Physics and Astronomy Research Council (PPARC) delicately puts it, Cern is "very hesitant" about putting millions of dollars worth of its data on to the Grid until it is sure it works perfectly. Biotech firms and geneticists hoping to fully exploit the huge quantity of data generated by the Human Genome Project have the same concerns.
"The question is whether the network between your computer and my computer is robust enough to ensure that when we send a message, it arrives unscathed – no mistakes, no glitches," says PPARC's chief executive, Ian Halliday.
His concerns highlight the biggest challenge for the Grid – ensuring that all computers are 100 per cent compatible. In sharing out that precious load of data among thousands of different machines, you have to ensure it all fits together seamlessly when it comes back.
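One standard way to ensure a chunk of data "arrives unscathed", as Halliday puts it, is to fingerprint it with a cryptographic checksum before it is shared out and verify the fingerprint on return. A minimal sketch, with purely illustrative names rather than any real Grid interface:

```python
# Guarding shared-out data against corruption in transit:
# checksum each chunk before it leaves, verify it when it returns.
import hashlib

def fingerprint(payload: bytes) -> str:
    """Cryptographic digest (SHA-256) of a data chunk."""
    return hashlib.sha256(payload).hexdigest()

def send(payload: bytes) -> tuple[bytes, str]:
    """Tag a chunk with its digest before handing it to the network."""
    return payload, fingerprint(payload)

def receive(payload: bytes, digest: str) -> bytes:
    """Reject any chunk that did not arrive unscathed."""
    if fingerprint(payload) != digest:
        raise ValueError("chunk corrupted in transit")
    return payload

chunk, tag = send(b"collision event data")
assert receive(chunk, tag) == b"collision event data"
```

A single flipped bit anywhere in the payload changes the digest completely, so the receiving end can detect the glitch and ask for the chunk to be resent.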
And to make life even more interesting, a little-acknowledged race has developed as corporations and computer scientists compete to crack that problem. Will the Grid evolve as a democratic technology or as the creature of global corporations?
The internet was originally set up to serve the academic community, letting researchers exchange information efficiently. As it developed, its sprawling, free-for-all nature created the feeling of a "virtual democracy" run by no one and controlled from nowhere.
The ideal for Professor Halliday is the visionary work of Tim Berners-Lee, a Cern scientist. In 1990, he created HTML, the web's universal document language, to allow different universities with incompatible computer systems to communicate freely. At Cern, a new generation of computer scientists hopes to emulate him by finding that universal language which is free for all to use. With the new collider due to spew out data in less than four years, Prof Halliday believes they have the greatest reason to get the Grid right.
However, ever since the web caught on, the likes of Microsoft and Oracle have looked to wield the same power that they do over operating systems and software. The Grid gives them the chance to control the new internet from the start.
IBM, though, says its involvement is essentially altruistic. It portrays itself as the champion of a free "open source" solution to the Grid problem – in contrast to Microsoft's commercially driven efforts.
At a conference in June, IBM and Sun announced the construction of Globus, an open-source software toolkit intended as a universal language for the Grid that would meet the huge data requirements of the biotechnology industry. And earlier this month IBM announced the construction of the first prototype Grid in Britain, linking eight universities into a new super-internet. Daron Green, the head of IBM's Grid development in Europe, says the core technology in Globus will remain free and open to all. He also claims it will become commercially available in six to eight months.
But Prof Halliday thinks that altruism is based on self-interest. "The general idea is sufficiently powerful that all computer people are going to be interested in pursuing it. So there's undoubtedly an element of competition. But I don't think it's cut-throat because there's enough uncertainty that IBM will want to see what we're doing, and it will be in its interests to let academics see what it's up to so we can share information."
The "brain" concept has also given rise to fears of a huge global computer with the sort of powers suggested by films like The Matrix, where humans create computers capable of replacing the human race. Science fiction it may be, but Bill Joy, co-founder and chief scientist of Sun Microsystems, recently said: "It is no exaggeration to say we are on the cusp of the further perfection of extreme evil."