Information superhighway? Bah!

Who is responsible for Internet gridlock? You are. Charles Arthur explains

Charles Arthur
Monday 28 July 1997 23:02 BST

Been on a busy motorway recently? Infuriating, isn't it, how when you want to overtake there are always people sitting in the outside lane, holding you up because they're held up by someone in front of them, who's held up by... It's a familiar picture. The same sort of behaviour is responsible for those exasperating episodes on the Net - the "gridlock" which appears out of nowhere and means either you can't log on at all (all the dial-up numbers are busy) or that sites you try to contact take ages to respond. "Host contacted. Waiting for reply" says the browser, and then nothing happens.

Now two scientists at Xerox's Palo Alto Research Centre (Parc) in California have put some numbers on the phenomenon - and warned that it could spell the end of the Internet as we know it unless we all start behaving better. In an article in last Friday's edition of the journal Science, physicist Bernardo Huberman and his student Rajan Lukose explain the gridlock problem as a classic case of the sociological dilemma of the "tragedy of the commons".

The original "tragedy" arose like this. The open parks of Britain's growing cities used to be free for all to use; and if you had a sheep, you could graze it there. They were nice, verdant places where the presence of a few sheep didn't affect things much. But as more and more people began using them the commons became overgrazed: the grass could not survive the intensive use. The commons became useless to everyone because nobody was in charge and nobody would look after them. The problem can be expressed more scientifically as: "Given a finite communal resource, individuals will seek to maximise their own gain. If there is no outside force keeping them in line, they will eventually destroy the resource for all."
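
To see the arithmetic at work, here is a toy version of the commons (every number below is invented for illustration): each extra sheep still pays its owner, right up until the pasture collapses for everybody.

    # Toy model of the commons: grass per sheep falls as the total
    # flock grows, and past the pasture's capacity nothing grows at all.
    CAPACITY = 100  # the most sheep the pasture can feed

    def grass_per_sheep(total_sheep):
        return max(0.0, 1.0 - total_sheep / CAPACITY)

    # One herder adding an eleventh sheep barely moves the total, so his
    # own gain outweighs his share of the lost grass...
    print(10 * grass_per_sheep(50))   # 10 sheep of 50 grazing: 5.0
    print(11 * grass_per_sheep(51))   # one more: 5.39 - better for him
    # ...but when every herder reasons the same way, the commons dies.
    print(10 * grass_per_sheep(100))  # 100 sheep grazing: 0.0 for all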

On the Internet, individual users can't see how their use of the finite links that carry everyone's data affects others, so they have no incentive to use less - even though restraint would benefit everyone. The Xerox Parc scientists discovered this after developing a statistical model of Internet usage. They found that traffic is fairly steady most of the time - but hit randomly by sudden, steep increases, followed by less steep declines. (Simulations of motorway behaviour have found the same thing.) This is because when people online find things slowing down, they stay put, hopeful that conditions will soon improve. Others join them, until the network is virtually overwhelmed. At that point, people in their millions decide there must be something more productive to do, and log off.
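
A crude simulation captures the shape of those spikes. The parameters below are invented for illustration, not taken from the researchers' model: arrivals trickle in, a random burst drives load up steeply, and the overload drains away only gradually as users give up one by one.

    import random

    # Sketch of the stay-or-quit dynamic: random bursts push load up
    # steeply; the excess then decays more slowly as users log off.
    CAPACITY = 100
    users = 50
    for tick in range(300):
        users += random.randint(0, 8)        # steady trickle of arrivals
        if random.random() < 0.02:           # a sudden burst of interest
            users += 60
        if users > CAPACITY:                 # congested: a few quit per tick
            users -= int(0.1 * (users - CAPACITY)) + random.randint(0, 4)
        else:                                # light load: normal churn
            users -= random.randint(0, 8)
        users = max(users, 0)
        print(tick, users)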

Normally, online congestion follows the ebb and flow of the day; but Net "storms" appear completely at random. They generally happen during peak usage periods - though not during every one. Their very unpredictability makes them an interesting statistical problem.

The researchers studied the passage of these "storms" by timing the round trip for a "ping" (a test message saying "Are you there?" to a remote machine) from Stanford University to the UK, a pathway that is one of the most congested in the world. Tests on other routes confirmed their findings.
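
Anyone can repeat a rough version of the measurement. ICMP "ping" packets need privileged raw sockets, so this sketch times a TCP handshake instead - a stand-in for the researchers' method, not a copy of it - against a placeholder host, not their actual route.

    import socket
    import time

    HOST, PORT = "example.ac.uk", 80   # placeholder, not the real path

    def round_trip_ms(host, port, timeout=5.0):
        start = time.monotonic()
        with socket.create_connection((host, port), timeout=timeout):
            pass                       # handshake done: the "reply" is back
        return (time.monotonic() - start) * 1000.0

    samples = [round_trip_ms(HOST, PORT) for _ in range(10)]
    print("min/avg/max ms: %.0f / %.0f / %.0f"
          % (min(samples), sum(samples) / len(samples), max(samples)))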

However, their suggestion for solving the problem is instantly unpopular - and, as one Net commentator pointed out, badly flawed. Professor Huberman proposed that to prevent such holdups, users should be charged in proportion to their Internet "consumption" - a "bit toll", rather like mileage charges on a toll road.
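
The arithmetic of such a toll is easy enough to sketch - though the rate below is invented, since neither the paper nor anyone else has set a price:

    PENCE_PER_MEGABYTE = 1.0   # hypothetical rate, purely for illustration

    def toll_pence(bytes_carried):
        return bytes_carried / 1_000_000 * PENCE_PER_MEGABYTE

    # A graphics-laden page of, say, 500 kilobytes would cost its
    # requester half a penny - wanted or not, as Peter Neumann's
    # objection below makes plain.
    print("%.2fp" % toll_pence(500_000))   # 0.50p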

Apart from the fact that we in Britain already pay a rough toll (our phone bill), there are other problems. Who should bear the cost? It's too easy to charge the wrong person, according to Peter Neumann, a principal scientist at the computer science laboratory of SRI International in Menlo Park, California.

"Suppose I'm using Netscape and I go to somebody's Web page and they've got tons and tons of graphics," he said. "Just going causes it to start downloading those images. I may not even want to see it, but I'm going to get charged for it."

However, the "commons" can expand. Enlarging the network's capacity - as the telecoms companies are doing, feverishly - should help solve the problem. John Quarterman, who has been measuring the growth of the Internet for years, commented: "With the Internet, grazing more cattle causes more grass to grow. It's not a zero-sum situation."

Concern that the Internet is heading towards a "tragedy of the commons" is not new. In 1993 the "spamming" of Usenet newsgroups with commercial messages seemed about to push it that way. But the introduction of "cancelbots", which automatically detect and delete over-posted messages from news servers, has by and large solved that problem. So those "commons" remain.
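
The idea behind a cancelbot is simple enough to sketch - the real bots used more refined spam criteria than this: notice the same message body turning up in too many groups, then issue Usenet "cancel" control messages for every copy.

    import hashlib
    from collections import defaultdict

    THRESHOLD = 20                 # hypothetical over-posting cutoff
    seen = defaultdict(list)       # body digest -> (message id, group) pairs

    def issue_cancel(msg_id, group):
        print("cancel", msg_id, "in", group)   # stand-in for the control post

    def consider(message_id, newsgroup, body):
        digest = hashlib.sha1(body.encode()).hexdigest()
        copies = seen[digest]
        copies.append((message_id, newsgroup))
        if len(copies) == THRESHOLD + 1:       # just crossed: cancel backlog
            for msg_id, group in copies:
                issue_cancel(msg_id, group)
        elif len(copies) > THRESHOLD + 1:      # already spam: cancel on sight
            issue_cancel(message_id, newsgroup)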

For the more general problem, a more popular solution among Internet providers for weathering the "Net storms" would be to offer various qualities of service for differing amounts of money. The Internet Engineering Task Force, which oversees the underlying functioning of the Internet, will meet in Germany next month to discuss various models for doing just that. But don't expect a solution in the short term.
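
One way to picture "qualities of service" is a priority queue at a congested link: whoever paid for the better class goes first. The class names and ordering below are invented for illustration, not one of the models on the IETF's table.

    import heapq

    PRIORITY = {"premium": 0, "standard": 1, "economy": 2}
    queue, order = [], 0           # entries: (priority, arrival order, packet)

    def enqueue(packet, service_class):
        global order
        heapq.heappush(queue, (PRIORITY[service_class], order, packet))
        order += 1

    def send_next():
        return heapq.heappop(queue)[2] if queue else None

    enqueue("mail batch", "economy")
    enqueue("video frame", "premium")
    enqueue("web page", "standard")
    print(send_next())             # "video frame" jumps the queue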
