The blame is often misplaced. Computers need be no less reliable than washing machines. It is usually in the software - the instructions machines are ordered to perform - that the problems lie. A good rule of thumb is that a big computerisation contract will take twice as long and cost twice as much as it was supposed to, and deliver only half the intended benefits.
The DSS's best defence is to say that it is not only the public sector that has had its fingers burned at the keyboard. The London Stock Exchange's Taurus settlement system is only the most recent of a number of celebrated private-sector computing disasters. Every big accounts department has its horror stories. Unlucky firms - aircraft manufacturers, for instance, or the maker of a notorious piece of computer-controlled radiotherapy equipment that ran amok, delivering huge radiation overdoses to cancer patients - actually see deaths result from their mistakes.
One reason for the failure of big computing is the inconsistency of human behaviour: programmers who try to develop computer systems to run artificial worlds such as railways, tax systems or company accounts often find that the clear rules by which things first appear to be managed are in fact contradictory. Matters are not helped by the fact that a single error by any one of the hundreds of programmers working on a project can easily bring an entire system to a halt.
Pessimists may conclude that software cannot be trusted - that one should never fly in a computer-controlled aircraft or board a driverless train, and never rely on the automatic traffic-management systems that Japanese and American companies are dreaming up for the next century. Optimists will point to the famous crash of the five-million-line computer program that runs America's long-distance telephone network. The surprise, they say, is not that the program cut off millions of customers when it went wrong but that its programmers managed to get it back up and running in only a few hours.