Another example is a company faced with a new marketing strategy from a rival that threatens to make massive inroads into sales. A response, supported by the company's IT system, must be made soon. Rank Xerox, for example, is attempting, in a relatively short time, to jump ahead of rivals in office equipment manufacture by making closer links between the customer and its sales, service and administration divisions. This sort of reorganisation always entails substantial modifications to IT systems.
The culture in both British and American software development companies is to have a carefully staged process whereby requirements analysis is followed by system design, detailed design, programming and testing. Each stage normally generates quite a large quantity of documents: requirements specifications, feasibility studies, test plans, quality plans, test procedures and so on. Unfortunately, the demands being placed on IT systems are such that a working system or a major modification is often required within a fraction of the time a conventionally developed system would take.
I recently organised a workshop for senior IT staff about what research needs to be carried out in IT over the next decade to solve the problems they are encountering. One of the principal messages from that workshop was that a main reason for lack of confidence in computing departments and outside suppliers was that they were seen as a brake on innovation: software producers using current development methods could not respond quickly enough to users' demands.
This is a message the Department of Trade and Industry has been hearing over the past two years. In response, it has decided to mount a special initiative to promote the development of research that would enable users, often via their IT departments, to modify their systems rapidly.
However, a number of groups could be adversely affected by this change in emphasis. First are those that advocate a careful approach to development of software, involving planning, requirements analysis, design, implementation and testing. This includes vendors of methodologies and proponents of total software quality assurance who are seeking to enclose their software projects within quality systems that insist on a large amount of documentation being produced.
The second group that would be affected is academics. For the past 10 years software engineering courses have stressed that software production is like any other engineering process and needs a carefully staged model of development. Many academics have ignored the fact that the techniques they have been teaching have not addressed the business objectives of the customer; that while the methods that are taught enable systems to meet technical requirements, they often lead to systems that fail because they do not respond to business requirements.
Software developers could rightly protest that they are now receiving contradictory messages from customers: get the system correct and, at the same time, get it delivered or modified quickly. An understandable response from them would be to say that you can achieve one or the other - but not both. Fortunately, new software development techniques that have been piloted in the United States and the UK by companies such as Oracle and Yourdon are beginning to address this dichotomy.
These methods are based on a concept known as evolutionary development. With evolutionary development a customer communicates his or her requirements to a developer who quickly produces a prototype that can then be shown to the customer. Normally, this prototype is produced using a technology that results in quick delivery at the cost of poor response time.
A process then occurs whereby the user repeatedly suggests modifications, which are then incorporated into the prototype until, eventually, he or she is happy with the product. Normally, what has been produced is a system that fits well with the business objectives of a company but is deficient in some way: it may contain a number of annoying little errors, or it may be inefficient in terms of response time or memory requirements.
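The iterative loop described above can be sketched in schematic form. This is purely illustrative: the function and parameter names (`evolve_system`, `customer_accepts`, `revise`) are assumptions made for the sketch, not part of any actual methodology or tool mentioned in the article.

```python
# Illustrative sketch of the evolutionary development loop:
# build a rough prototype quickly, then repeatedly incorporate
# the customer's suggested modifications until they are happy.
# All names here are hypothetical, chosen only for this example.

def evolve_system(requirements, customer_accepts, revise):
    """Refine a prototype until the customer accepts it.

    customer_accepts(prototype) -> bool: customer approves the prototype.
    revise(prototype) -> prototype: incorporate requested changes.
    """
    # The first prototype is produced quickly and cheaply from the
    # stated requirements, with no attention yet to efficiency.
    prototype = ["prototype built from: " + r for r in requirements]
    while not customer_accepts(prototype):
        prototype = revise(prototype)
    return prototype


if __name__ == "__main__":
    # Toy usage: the "customer" accepts after three rounds of revision.
    state = {"revisions": 0}

    def accepts(prototype):
        return state["revisions"] >= 3

    def revise(prototype):
        state["revisions"] += 1
        return prototype + ["revision %d" % state["revisions"]]

    final = evolve_system(["respond to rival's strategy"], accepts, revise)
    print(len(final))
```

The design point of the sketch is that acceptance is judged by the customer's callback, not by any technical criterion inside the loop, mirroring the article's claim that the result fits the business objectives even if it is technically deficient.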
However, in a quick-moving business environment, where millions of pounds are often at stake, an upgraded hardware purchase can usually take care of the second problem, while the prospect of gaining a large market advantage over a competitor makes the minor errors far easier to tolerate. Once the prototype system has been developed, the software delivery team can eliminate the errors, document the system according to the best dictates of quality and deliver a better version of the system to the user. This improved version can then be used as the basis for further changes.
Without this tidying up a system will become more and more baroque in its structure, to the point where even hackers would take a long time to modify it. What is surprising about the demands now being made by users is that they mark the probable rehabilitation of the hacker and potential formation of a strange alliance between two diametrically opposed cultures. Industry analysts are now coming to the conclusion that both conventional software developers and hackers will be needed in the future: the hacker for quick delivery or quick modification and conventional development staff to tidy up the mess that has been made.
The implications for skills, job satisfaction and the quality systems software developers use are massive.
The author is professor of computer science at the Open University.