Leading Article: The certainty of risk

London Underground Ltd wants a relaxation of the fire regulations so that it need spend no more money on averting station fires that probably will not happen. The King's Cross underground fire of 1987, in which 31 people died, was the product of a culture of slovenliness. In the aftermath it was estimated that a big escalator fire could be expected once every 10 years. Subsequent reforms have reduced the estimated risk of a station fire to one every 500 years, which is quite acceptable. But now LUL is faced with further huge bills to fireproof station walls, and it would rather spend the money averting the more urgent danger of train crashes.

This is the product of sensible thinking about risks and priorities, which is rare, and surprisingly difficult.

What is the likelihood that the world's first reusable spacecraft will crash when a huge rubber band cracks? Or that the biggest and safest ship in the world will sink on its maiden voyage? The answer in both cases is that it is certain. It has already happened to the Challenger and the Titanic. The only certainty when assessing risks is that there is no absolute security, and only fools and lawyers for the bereaved pretend otherwise.

The purpose of public policy, then, must be to keep risk at acceptable levels, not to attain perfect safety. Acceptable levels may seem in practice indistinguishable from perfect safety: for any individual traveller, the danger of anything going wrong on a Western airline is insignificant. Yet air crashes still happen. Terrorists still manage to plant bombs on planes, too, except, so far, those of El Al. But that airline's security procedures are so tedious and time-consuming that many passengers prefer to take the marginally increased risk and fly with other airlines. And it is important that they be free to do so. After all, even flying Aeroflot is probably safer, if less comfortable, than bicycling through central Birmingham.

One reason why the standards of aircraft safety are so high is that the consequences of any failure may be catastrophic. The risk of falling off a bicycle may be greater than the risk of falling out of an aircraft, but the consequences are usually much less dramatic. This is what makes safety calculations so difficult for nuclear power stations, and for the waste they leave behind them. The calculations may show very small risks of disaster, spread over a very long period, but no one yet knows all that such a disaster might entail. Nor is it clear how to measure these risks against the dangers of global warming, or global impoverishment. In the last resort, such large questions are beyond the bounds of science. They can be answered only by fallible politicians.

The politicians' urge to pretend that there need be no hard choices between risks is understandable, and sometimes laudable. No one wants to fly in an aircraft patched up the way a bicycle can be patched up, as if no higher standards were attainable than those of an amateur bicycle mechanic. But the search for safety in the modern world carries its own risks. People may be tempted to rely on computers. Perhaps the grandest example of this dangerous folly was President Reagan's Star Wars project, which would have depended on millions of lines of untested computer code. Anyone who has sworn at an automatic teller machine will appreciate the lunacy of that President's search for safety.