The ultimate global network

Within 20 years computers will be everywhere, and they'll all be talking to each other. Daunting? Not if we're prepared, says a group of British scientists. Richard Sarson reports

You may think that computers have taken over most aspects of your life. But you ain't seen nothing yet. It's predicted that in 20 years' time, "there will be 100 times more computers, each 100 times more powerful than now. They will be everywhere - in toys, in books, in clothes, under our skin, in chairs, in pills. They will be connected by wireless at 100 times today's speed. Their programs, however, will still be hand-crafted, and subject to continuous change." That's the view of Sir Tony Hoare, previously Professor of Computer Science at Oxford University and now a researcher at Microsoft's Research Lab in Cambridge. He wants fellow computer scientists to think about how to handle this scary new world.

"We have already almost lost the internet to viruses which can cause chaotic change," says Professor Robin Milner, of the Cambridge University Computer Laboratory. "As computers proliferate, we become even more vulnerable to the damage that the microcosm can do to the macrocosm."

To ward off these evils and prepare for the future, Hoare and Milner are launching a series of "Grand Challenges" to the UK's computer scientists. The seven challenges spin off in different directions from a single big idea: that all the computers in the world will become part of one Global Ubiquitous Computer. Hoare wants "to understand these enormous artefacts, which have rather escaped the control of their original designers. At one time, the complexity may have been artificial, but now it is almost natural, rather like the complexity of organic chemistry."

One challenge explores the engineering design principles, compromises and technological tricks essential to building the Global Ubiquitous Computer. According to Jon Crowcroft of Cambridge, "It hopes to see a set of engineering rules of thumb maturing into design principles, which can then be applied to other systems." Another challenge, "journeys in non-classical computation", hopes to find newer and more appropriate devices than today's computers to run such man-made organisms.

A grand challenge on "dependability" aims to answer more mundane questions about the soundness and security of programs running in homes, offices, cars, planes, rockets and so on. This covers, according to Hoare, "the engineering question, 'how does computer software work?', and the scientific question, 'why does it work?'" With luck, this research will stop programs on your desktop from crashing - or satellites from falling out of the sky - because of some simple design error.

As technology improves, we are putting more and more personal information into our computers - photographs, video clips, correspondence, diaries, tax forms and so on. These are your "computer memories", but they bring problems. How can they be stored reliably over decades, and then read? How do you protect your privacy, especially when you are present in someone else's memory - an ex-lover's for example? How do you ask questions such as "find me a picture of me playing with my grandson when he was a baby"? The "memories for life" challenge hopes to solve all this.

Two other grand challenges will examine life forms. One of them - "In vivo - in silico" - will analyse the nematode worm, which has just 1,000 cells, 100 of which are nerve cells. "The Sanger Institute outside Cambridge can sequence 80 million base pairs [the keys of the genetic code] per day, far beyond the range of our current theoretical understanding of biology," explains Hoare. "Computer science can help to make sense of all this data." To meet this challenge, a new breed of scientist is emerging, one that understands both biology and computing. They could move on to understand the regeneration processes in plants and animals - and even to create self-repairing software and hardware on your desktop.

The final challenge moves from basic biology to "the architecture of brain and mind". This will bring together biologists, brain physiologists, nerve scientists, psychologists, linguists, social scientists and philosophers to work out how the grey and white mush of our brain can constitute the most powerful and complicated computer on the planet: our mind. Scientists have been trying to create intelligent robots for years, with little success. This grand challenge takes another run at the problem.

The challenges will not end up as instant software tools to run the world. That, says Hoare, is the "job of the entrepreneur". But the scientists can provide the theory behind those tools. All these challenges should focus the minds of scientists in other disciplines, and outsiders, on why they need computer science to understand the Global Ubiquitous Computer.

Wendy Hall, the new President of the British Computer Society, wants to make the grand challenges the centrepiece of her presidency. At the annual BCS conference next March, she is combining the research grand challenges with the challenges facing computer teaching, to attract the whole UK computer science community. She wants to persuade them to "think bigger, not just incrementally". The conference will show her whether the challenges have enough support to carry on, to ask for international help, and even to win Government funding.

To anyone who doubts that computer science can help to build the Global Ubiquitous Computer, Hall points out that today's rudimentary global computer, the World Wide Web - which many people imagine to have sprung fully formed from the head of Tim Berners-Lee in 1991 - in fact borrows heavily from much older theory: text mark-up languages, hyperlinks and packet switching. The world's favourite search engine, Google, she says, has a similar theoretical history.

In the same way, to be robust, the next Global Ubiquitous Computer urgently needs some bright new science to back it up.

The Grand Challenges website is