Ever since computers worked their way into global financial markets, a race has been on. Until recently, leading the field was the "new opportunities" team - those who develop ever more sophisticated IT-enhanced financial instruments to gain market advantage and substantial profits. Following them has been the "financial regulators" team - those bodies from central banks and individual firms whose task is to monitor activity and set limits on the exposure a bank or market can carry to these derivatives. But as the race accelerated, an unexpected shock hit the field: the earthquake whose epicentre was the collapse of Barings.
In an annual lecture hosted by the British Computer Society and Unisys last month, Howard Davies, deputy governor of the Bank of England, addressed this issue. "Risk in financial markets: is IT the problem or the solution?" was his title. At the heart of the race is the unprecedented opportunity offered by computers to collect and analyse information about market activity. The constant processing and assessment of prices and volumes has revolutionised exchanges and contributed to their substantial growth. But not without cost. As Mr Davies put it, "My overriding question is ... whether all this information is producing knowledge which allows market participants to manage risk better, or whether it is, by contrast, creating a more dangerous world?"
The Bank of England is not the only institution vexed by this dilemma. The European Union has done its own work on the threat, too, partly in the Capital Adequacy Directive, and also in the Basle proposals. The latter states: "Understanding and protecting against the vulnerabilities of a financial company's risk-taking activities is one of the major responsibilities of its board of directors and senior management. [They] must regard risk control as an essential part of its business to which significant resources need to be devoted."
As analysts and investigators have concluded from the Barings collapse, at least one finger needs to be pointed at the City bosses. The causes were partly bad management and partly unregulated greed. But in between those two stands information technology. In the case of Barings, had management installed a system to check on the activity of the traders, independently of the traders' own reports, then Nick Leeson's activities would have come to light. One author on the subject, Stephen Fay, even goes so far as to say, "Barings didn't spend enough on information technology, and I suspect that they are not alone in this. They'd rather spend the money on opening new offices and paying themselves large bonuses."
But that was more than a year ago, and since then the imperative of good risk management has come to the fore. The solution demanded of IT now is real-time, independent lines of reporting. But this requires a mountain of data to be brought online. What is being requested is a network that can bring to one point, more or less instantaneously, the details of the tens of millions of trades made each day, on the disparate market floors from Tokyo to London to New York.
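At its simplest, an independent line of reporting amounts to reconciling the positions traders say they hold against what the exchanges and clearing houses record, and flagging any gap. The sketch below illustrates the idea only - the account names, figures and tolerance are hypothetical, not drawn from any real bank's system:

```python
def reconcile(reported, cleared, tolerance=0):
    """Compare trader-reported positions against independently
    sourced clearing-house records; return the discrepancies."""
    breaks = {}
    for account in set(reported) | set(cleared):
        r = reported.get(account, 0)
        c = cleared.get(account, 0)
        if abs(r - c) > tolerance:
            breaks[account] = {"reported": r, "cleared": c, "gap": c - r}
    return breaks

# Hypothetical example: a desk reports a flat book on one contract,
# but the clearing house shows a large open position - exactly the
# kind of gap an independent reporting line is meant to surface.
reported = {"SIMEX-NIKKEI": 0, "SIMEX-JGB": 150}
cleared = {"SIMEX-NIKKEI": 20000, "SIMEX-JGB": 150}

print(reconcile(reported, cleared))
```

The crucial design point is that `cleared` comes from outside the trading desk, so no trader can make a discrepancy vanish simply by adjusting their own reports.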
The volume of the calculations then required to analyse the risk this data represents is vast. Data warehouses, of the order of the terabyte, need to be assembled. Prediction tools that employ the latest in artificial intelligence technology have to be developed. And the network itself must be protected against information overload.
No doubt the task will be done. But even then, as Howard Davies seemed to suggest, who is to say who will have won the race?