Whodunit to IBM?: Management guru Robert Heller takes a stab at what really happened to the company that had dominated the computer world until the 1990s
Sunday 20 March 1994
A year later, in March 1993, the doctors certified the death. John Akers, the chairman, lost his job - the first IBM chief executive ever to do so. An outsider, Louis Gerstner Jr, took over, bringing with him a posse of strangers. The old IBM was dead.
But who killed it? Was it the management? The competition? Or the customers? As in an Agatha Christie novel, many suspects are assembled in the drawing room for the showdown. Which is the guilty party?
The least culpable is the most easily blamed: technology. One can argue that the speed, force, depth and breadth of the second wave of computer technology would have swamped IBM in any circumstances. That was historical inevitability.
The little personal computer unveiled to such brilliant effect in 1981 was IBM's nemesis. Its primitive technology developed so rapidly that within a decade the baby would challenge all but the most powerful of its parent's computers - and leave them helplessly behind in cost. Whatever IBM's management had done, its mainframes - and, not far behind, its minicomputers - were bound to stagnate.
The essence of the new technology was ease of replication and communication. IBM strove to hold back the tide, but the technology could not be withstood. In a world of ridiculously cheap computing power that was getting cheaper all the time, packaged software programs and individual networks, nobody could maintain proprietary control. That meant, eventually, the loss of the account control that cemented the corporation together. IBM could do nothing to stem this loss.
The technology needed an accomplice: the IBM culture. The conspicuous end-product failures were bad enough - the four-year lag in PCs, the 11-year lag in minis, the five-year lag in both laptops at one end and engineering workstations at the other.
The corporate damage - long-term and short - would have been limited, if only IBM had kept abreast of mainstream technology in its product lines. Behind those painful delays lay a trail of missed deadlines, lost opportunities, mistakes and misunderstandings in key technologies.
For all the protestations of its leaders, the modern IBM was never, first and foremost, a powerhouse of innovation. Its technological resources are immense, but its procedures and principles have retarded the progress of products to market. The culture was supposedly both customer-driven and innovation-led. In reality, the corporation was geared to eschewing risk and maintaining margins - whatever the technology might make possible, whatever the customer might demand.
On one argument, IBM's product lags were a deliberate and integral part of this overall policy. If you dominate a controlled market - in which IBM is as synonymous with computers as Kellogg's is with corn flakes - you can see the attraction of delay. If IBM shunned an emergent sector, its potentially disruptive growth would be stunted, possibly aborted. In launching its own PC, was IBM seeking to create a market or to control it?
The latter interpretation would explain why IBM underestimated initial demand for its PC so seriously: it could have been wishful thinking in reverse. But the customer had wishes, too. Once personal computing had been legitimised by IBM, the latent desire for computing on the desktop, under individual control and infinitely flexible, burst out of the closet.
The customer is also a suspect. Loyalty to IBM was still unchallenged in the 1990s. The brand, thanks to sedulous marketing down the years, remained one of the world's best known and most respected - the only office equipment brand to rank alongside mighty consumer equivalents such as Coca-Cola and McDonald's. But brands often maintain historic strength when current market penetration is shrinking fast. Even the most loyal customers will switch brands for several reasons - some positive (price, performance, choice), some negative (bad service, poor quality).
IBM was once widely believed to bind customers to its products with its superlative service. True, commentators also pointed to the 'FUD' factor - Fear, Uncertainty and Doubt. The fearful could reflect that 'nobody ever got fired for buying IBM'. The uncertain could reassure themselves by placing their computing futures in the hands of IBM. The doubtful were spared from the necessity of risking the company's money (and their own careers) on backing a non-IBM supplier or solution.
How much of IBM's hold on the market was positive, how much negative? The steady decline of the company's Fortune rating by its peers points to increasingly negative feelings. IBM's sales strategy was directed at the Fortune 500, America's leading corporations. The worsening appraisals presumably reflected increasing dissatisfaction with IBM's handling of their accounts.
The sharp increase in Digital's share of the corporate market with VAX machines - computers able to intercommunicate fully and freely - was further evidence that IBM's hold over its largest and most preferred customers was weakening. Delays in delivering promised products (both hardware and software) must have added to the disillusion and irritation of IT managers. It did not matter whether the delays were inadvertent or deliberate.
As a marketing ploy, premature announcement of pre-emptive products can be effective; but for every pre-empted competitor - a plus - there are many frustrated customers - thousands of minuses.
If the frustrated buyers were locked into IBM configurations, too bad. They had, willy-nilly, to wait. But the advent of compatibility, above all in PCs, removed the need to wait. And PC customers had no reason for fear, uncertainty or doubt. They bought on three counts - price, performance and product features. They could judge these perfectly well for themselves, or with the aid of a booming, expert specialist press - and IBM was beaten too easily on all three counts.
That was no accident. In one respect, IBM had been absolutely right in all its courtroom protestations in all those victorious lawsuits. The industry has always been a competitive battleground. The competitors, in seeking their own success, necessarily aimed to inflict maximum damage on IBM. For many years, IBM was able to repel all would-be assassins. But their numbers kept on multiplying - not just mainline hardware rivals, but plug-compatible peripheral suppliers, leasing companies, added value retailers, software companies of all kinds, microprocessor manufacturers.
The monster grew too many heads for Hercules to lop off. As one fell, anyway, a hundred sprouted in its place. This overflow of competitors suggests another suspect: the market. As the suppliers proliferated, so the economics deteriorated. In 1992, Mr Akers could blame his company's operating losses (among other factors) on intense price-cutting in the PC market. Weakening prices across the board had undercut all his efforts to arrest decline.
The pattern of over-supply and commodity pricing has turned the PC dream market into a potential nightmare. But Compaq and Apple, IBM's main competitors, were both profitable in 1992. Unlike IBM, neither had other multi-billion-dollar product lines to offset the problems in PCs. The market may be an accessory after the fact but can otherwise, on this evidence, be eliminated from the inquiry.
Was the murder the work of two traitors? Was IBM stabbed in the back by two companies that grew rich on that back: Intel and Microsoft? By building their own businesses outside IBM, the pair automatically reduced IBM's market share.
Every Intel 286, 386 and 486 chip sold to non-IBM customers - like every DOS operating system - was a loss to IBM. By 1992, Microsoft was in such open combat with IBM that some observers consider the fateful decision to commission the operating software for its PC from young Bill Gates, Microsoft's founder, to have been the real villainy.
If IBM had maintained proprietary control over the operating system and the chips, would that have protected its future? Apple's story does not encourage that theory.
For all its trials and tribulations, Apple sold the wholly incompatible Macintosh so successfully that it closed the gap with IBM in the US - despite lower credibility with large corporate buyers.
If cloning the IBM PC had been harder, other IBM attackers would have developed their own operating systems or taken out licences from IBM - perhaps with help from anti-trust authorities.
As for microprocessors, cloners were by 1992 working on Intel's Pentium chip before it had even reached the market. At best, IBM could have delayed the inevitable. Sooner or later, the technological genius and entrepreneurial drive of the super-nerds would have broken the giant's hold. In the super-nerds' wake, the copyists would have come thundering through.
If a proprietary IBM had resisted or missed the technological trends (for instance, the miniaturisation that magically created laptops and palmtops), it would simply have created greater opportunities for others. Like the car manufacturers of Detroit, with their vain resistance to lower-profit, smaller cars, IBM could not hope to stop the future; not where geographical and economic barriers to competition were breaking down universally.
Enter another suspect - the change from a cartelised world economy, controlled by economies of scale, to a global marketplace. Within that arena, sourcing has become universal, and efficiencies, rather than scale economies, are decisive.
The Asians, led by the Japanese, have murderous credentials in this respect. Japan's only truly successful end-product attack in computers has been in laptops. Even so, its mastery of vital component technologies - from random access memories to flat screens - has undermined IBM's control of the market.
The vertical integration for which IBM was famous merely condemned it to costlier and inferior technologies. It enabled competitors to beat IBM on the vital factors of price and performance. The company's reaction to this looming threat, a $10bn investment in automation, made the problem worse, lowering the productivity of capital, locking the corporation into long, unwanted product runs, and missing the true policy need.
The detective, like Hercule Poirot on the Orient Express, thus has too many suspects: the Technology, the Culture, the Customer, the Market, the Supplier turned Competitor, and Globalisation. Like the train murderers, all stabbed IBM - repeatedly. It was not the death of a thousand cuts, but a million. Yet none of the blows would have been fatal - apart from self-inflicted wounds. For the final suspect is: Management.
Ineluctable external forces will always, at times, swamp the best of managements. But their supreme test, the proof of whether they truly are 'the best', comes in times of such tribulation. IBM's top management, by that criterion, failed. Performance deteriorated on every significant corporate measure, financial or non-financial, over the Akers years. To read the chairman's annual statements is to enter the 'all's for the best' world of Voltaire's Dr Pangloss. The repeated optimism in the face of mounting crisis amounts, on the most favourable interpretation, to massive self-deception. To argue that nothing within IBM's power would have achieved better results is not acceptable. Managers are paid to succeed, not fail.
The problems were (and are) hideously complex. IBM's top managers made them worse by the series of false starts along the road to radical reform. The famous military confusion - 'order, counter-order and disorder' - accurately describes the process that destabilised the corporation. It did so without arresting the decline in its competitive prowess. There were no compensatory benefits.
The dismissal of Akers, and the appointment of a new top management, removed one issue from the argument. The radical measures launched at the end of 1991, and prosecuted with some vigour thereafter, will be replaced in whole or part by new strategies. Whether or not they might have succeeded is an academic question. But terrible damage has been inflicted that cannot be repaired or reversed. Akers, a pre-eminent product of the IBM mould, could not recognise harsh realities that cut against the corporate grain.
The realities sprang from a common source. The slowdown in mainframes and minicomputers; the sensational rise in power of personal computers at one end, and their descent into commodity status at the other; the astounding progress in miniaturisation; the collapse of proprietary systems and the rise of open ones; the newly asserted independence of customers; the challenge of the software and microprocessor firms - all are symptoms of an irresistible upheaval. Changes in technology and changes in society (as usual) feed off each other.
By the millennium, and probably well before, the fate of IBM will be settled. It could become a shadow of its past - a basically centralised, bureaucratic, demoralised hulk, tied up in failed alliances, with a steadily leaking market share. At the opposite extreme, it could become a successful federation of genuinely independent states, each with its own sphere of competence and circle of influence. That would be the ultimate irony.
Throughout the 1970s, IBM devoted great energies to avoiding break-up by the Federal trustbusters. In its closing years, the Akers regime started to attempt a controlled break-up. In everything save equity ownership, the Akers plan would probably have been approved by the anti-trust authorities. The break-up they accepted in the contemporaneous case of American Telephone & Telegraph was similar. It forced the creation of smaller, regional 'Baby Bells', which generated a near-threefold rise in value for shareholders from 1982 to 1992.
That represented $132.4bn of wealth creation, six times the advance in the worth of IBM. Maybe turning IBM into a loose grouping of a dozen or so separately quoted companies would work the same magic. But the snags, as new chairman Lou Gerstner quickly discovered, are legion and literally large. If size is one of the ailments that has enfeebled IBM's performance, break-up cannot, by itself, cure the disease. The reason is simple.
The Akers re-division of the business is topped by four huge units. The mainframe-related operations, with sales of $22bn, are bigger than United Technologies, which is sixteenth in the Fortune 500. The next three, with sales ranging from $11.9bn to $11.4bn, are all larger than the great Motorola. This leading group of manufacturing and development businesses is followed by a second bunch, also relatively big by industry standards.
One of these $2bn-plus runners-up outranks disk-drive maker Seagate (166th); another two are each bigger than Tandem (217th). This second set of software, peripheral, component, data transmission and service companies are not viable independent competitors in their present shape. They are either too dependent on IBM's captive markets or are not fully equipped for the rigours of the marketplace - or both.
The marketplace looms as an especially large problem given that Mr Akers established four other companies in marketing and service. Two are gigantic (covering North America and Europe-plus), two merely huge (Asia and Latin America). Together, they account directly for the overwhelming bulk of all IBM sales. Setting these free would create 'hollow' corporations - making nothing themselves - that go far beyond anything yet envisaged. Could they hope to compete successfully against smaller, tighter, integrated operations such as NCR or Apple?
Their IBM suppliers would be in a comparable bind. Could they generate enough profit when so much added value would disappear into the pockets of the trading companies? Would not IBM's problems of the 1980s and 1990s - the overlaps and excessive layers - be aggravated, rather than eased?
Small wonder that a Fortune writer, David Kirkpatrick, commented: 'IBM faces practically every challenge known to management.'
So does Mr Gerstner. On the evidence of the fumbling of the 1980s, those challenges would certainly have defeated the best efforts of the Ancien Régime in the 1990s. The whole saga is reminiscent of the decline and fall of the Roman Empire. Its causes have never been finally determined - but it was irreversible and awesome. The wasteful excesses of Rome, the incursions of the barbarians, the economic cancer eating away at the empire's vitals, the incompetence of the emperors - it was all these things (which have their parallels at IBM) and none of them. The rise of Rome had set in train great, sweeping trends that its rulers were unable to comprehend fully or control.
The grandeur of IBM, like the grandeur that was Rome, is now history. During that history, IBM played a glorious, magnificent, imperial part. No group of people anywhere in America or the world approached the immense contribution of the IBMers to the Age of Information. This deeply committed, worldwide community translated the technology into brilliant products and placed them in the hands of users whose applications drove the entire global economy forward.
IBM was perhaps essential for the unfolding of the Age; possibly its rapacity - the reverse side of its ambition - played a part similar to that of the Robber Barons. Their robbery created America's gigantic industrial wealth after the Civil War - possibly an indispensable, certainly an indefensible catalyst. But in the 1990s, IBM has lost its indispensable role. If it did not exist, it would not be necessary to invent it.
Can the new regime 're-invent' the great corporation in another, but still recognisable and successful form? Miracles can happen, even in management. Nobody can predict the outcome of the corporate upheavals currently going on inside IBM. If the reforms do succeed, that will be an achievement of unexampled brilliance. The future of the large corporation and, to a lesser extent, the fate of the West's advanced technology will both look far more secure.
The progress of that technology has been history's most fascinating display of human initiative, ingenuity and creativity. Even the ferment of the Italian Renaissance made less impact on human lives. For a long time, IBM seemed to embody all the hopes and fears raised by that progress.
With the death of the old IBM in February 1992, the torch has passed, not only to its new leaders, but to those in many other companies. Their task is to build a new, multi-faceted, democratic order to supersede the empire that, in the final analysis, killed itself.
Heller Arts Ltd 1994. From 'The Fate of IBM' by Robert Heller, published by Little, Brown, £16.99.
(Photographs and graphs omitted)