Something huge, scary and unprecedented happened on 6 May 2010. If you live in the UK, however, you may hardly have noticed it – or even know that it occurred. Within half an hour, the US stock markets fell off a cliff. The Dow Jones index registered a fall of 998 points, the largest daily drop in its history. That included a crash of 600 points within five minutes: the span that David Leinweber, head of computing research at the Lawrence Berkeley National Laboratory in California, has dubbed "the wildest ever five minutes of market data". During this mad half-hour, around $1 trillion was wiped off the value of global companies. In Chicago, an automatic lock on trades kicked in. Most losses were recovered within an hour.
For a short while, the world's financial system – and with it the jobs, savings and pensions of us all – stared into a bottomless abyss. No one then seemed to have the slightest idea of why and how this "Flash Crash" had struck. Now we do. A variant of computerised market trading known as High Frequency Trading (HFT) had in effect run riot, cutting loose from all human control. According to a study of the 6 May emergency by the US Securities and Exchange Commission, the algorithms employed by HFT programs were "clearly a contributing factor" in the Flash Crash.
With multi-million dollar trades zipping around the markets in milliseconds, at the behest not of human brains but of algorithmic formulae, "an important speed limit has been breached". Those words appear in a working paper on the risks of computer trading just issued by Foresight, the UK government's scientific advisory body. "It might be argued," runs this drily spine-chilling document, "that the more trading decisions are taken by 'robot' CBT [computer-based trading] systems, the higher the risk of such wild feedback loops". Free of human oversight, the "normalisation of deviance" by such programs multiplied outlier decisions and extreme events.
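The mechanics of such a feedback loop are simple enough to sketch. The toy model below is my own illustration, not anything from the Foresight paper: a population of threshold-based "robot" traders each sell once the price falls past their loss tolerance, and each wave of selling pushes the price down far enough to trigger the next wave. All the parameters (tolerances, market impact, the size of the initial dip) are invented for the demonstration.

```python
# Toy illustration only (not a model from the Foresight report):
# threshold-based "robot" traders sell when the price falls past
# their loss tolerance; their selling moves the price, which
# triggers more sellers -- a wild feedback loop.

def simulate(start=100.0, n_traders=1000, impact=0.5, max_steps=30):
    # Loss tolerances spread from 0.1% up to about 10%.
    tolerances = [0.001 + 0.0001 * i for i in range(n_traders)]
    sold = [False] * n_traders
    price = start * 0.995          # a small initial dip starts the cascade
    history = [start, price]
    for _ in range(max_steps):
        loss = (start - price) / start
        new_sellers = [i for i, tol in enumerate(tolerances)
                       if not sold[i] and loss > tol]
        if not new_sellers:        # no one left to panic: the loop dies out
            break
        for i in new_sellers:
            sold[i] = True
        # Selling pressure moves the price in proportion to the sellers.
        price *= 1 - impact * len(new_sellers) / n_traders
        history.append(price)
    return history

crash = simulate()
print(f"cascade of {len(crash) - 2} waves; "
      f"price fell {100 * (1 - crash[-1] / crash[0]):.1f}%")
```

A half-per-cent dip, on its own, should cost half a per cent; in this toy market the cascade turns it into a rout of over 40 per cent in a handful of waves, which is the shape, if not the scale, of what the report's "wild feedback loops" describe.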
So have the humans now recovered control, even though automated trades now account for more than a third of UK equity business and up to 70 per cent in the US? Forget it. It seems that on 1 September 2010, a similar cyber-frenzy exceeded the peak volumes of 6 May, but "no official investigation was commissioned". The gas markets went similarly loco on 2 February this year, and oil on 8 June. "Currently", the Foresight researchers admit, "the scientific literature on complex nonlinear dynamics of networked systems is (in comparison to other fields) in its infancy with regards to concrete predictions and reliably generalisable statements". Or, debugged: we don't know jack. So keep your fingers crossed, punters – and savers, and pensioners, and employees.
Does the Flash Crash fail to ring much of a bell? Don't worry: 6 May 2010 was also the date of the General Election. Talk about a good day on which to bury bad news. Someone in Britain, though, had noticed the gravity of the events. Luckily for readers who wish to grasp the slenderness of the digital threads on which our economic survival hangs, it was the writer perhaps best equipped to make a mass audience appreciate these risks.
Robert Harris's new novel The Fear Index (Hutchinson, £18.99) races along as a thriller of high finance set during a single day: that of the Flash Crash. I have to obey spoiler-alert protocols at this point, because it is very hard to summarise what Harris so grippingly achieves through this material without letting some conceptual cats (Schrödinger's, perhaps?) out of the bag. So, if you prefer, look away now and read the book. You will do so very rapidly.
Along the way, The Fear Index picks up some classic science-fiction elements – or rather, I assumed that the SF label would apply, until I began to research this piece. At root, Harris offers a version of the Frankenstein story.
He borrows an epigraph from Mary Shelley's foundation myth of the robot servant that turns against its scientist creator, which warns "how dangerous is the acquirement of knowledge". A prodigious teenager, Mary Shelley invented Dr Frankenstein and his Creature beside Lake Geneva in 1816-1817. Harris chooses that same shore as his scene. Characters' and place names nod to the febrile circle of friends and lovers – Mary, Percy Shelley, Lord Byron, Dr John Polidori – gathered in the Villa Diodati.
I would add another reference from the ramifying branches of the Byron-Shelley clans. In 1843, the mathematician Ada Lovelace – Byron's legitimate daughter – wrote a quite extraordinary set of notes in response to an Italian article about Charles Babbage and his ideas of the "Analytical Engine": generally viewed as the first expression of computer theory. Lady Lovelace's Objection, as it came to be known, stated that "The Analytical Engine has no pretensions to originate anything. It can do whatever we know how to order it to perform."
But what if such an engine did acquire the "pretensions" to originate something? From that premise and potential has stemmed a still-flourishing school of science fiction in which rogue robots turn against their manufacturers. Breaches of the first two "Laws of Robotics" framed by Isaac Asimov in 1942 ("A robot may not injure a human being or, through inaction, allow a human being to come to harm; A robot must obey any orders given to it by human beings") continue to power a thousand plots in print and on screen.
Doubting footnotes to Ada's Objection also began to shape the course of computational science itself. In 1950, in the journal Mind, the mathematical genius – and pioneer computer-builder – Alan Turing published a paper, "Computing Machinery and Intelligence". It famously begins: "I propose to consider the question, 'Can machines think?'" In principle, if not in the technological practice of 1950, Turing answers in the affirmative.
He sidesteps the philosophical theories of mind and simply posits convincing human-like machine behaviour as his criterion. The study, and later creation, of artificial intelligence starts in these pages. And Turing himself, in his wartime code-breaking days at Bletchley Park, would later supply an intellectual template – albeit one made conventionally heterosexual, unlike his gay and hideously victimised original – for the hero, Tom Jericho, of Harris's novel Enigma.
Over the single day of The Fear Index, a mutation of AI runs utterly amok. The novel's obsessive hero, American mathematician Alex Hoffmann, has turned away from pure science at the CERN laboratory to build up a $10bn hedge fund on the Genevan shores. As you would expect, Harris clarifies the science behind the hedge in terms thoroughly accessible to lay readers, often via Alex's dialogue with his briskly upper-crust English CEO, Hugo Quarry.
Hoffmann, however, has improved on the hedgie maths that can deliver an 83 per cent return on investments for his creepy congregation of super-rich clients. To him, and his far-from-merry band of geeks, he has "created King Midas out of silicon chips". Sensibly, Harris goes easy on the satire: we're all in too deep for that now.
The antiseptic offices where his super-qualified analysts – the "quants" – labour over screens may resemble "a United Nations conference on Asperger's Syndrome". But for them, and the boss, the science counts for more than the cash – although the odd $4m bonus comes in handy. For Hoffmann, his fortune has become "a sort of toxic by-product of his research": a poison that has damaged his marriage to a Yorkshire-born artist, Gabrielle. In her works humanity and technology converge (a presiding theme) as CAT scans of bodies are imprinted on a form of hi-tech stained glass.
From the early hours of 6 May, when Alex reads a first edition of Darwin's The Expression of the Emotions in Man and Animals sent anonymously via an Amsterdam book dealer, we sense that some force plans to frighten him to death. And fear is the emotion that has allowed him to steal a march on his hedge-fund rivals via a trading algorithm, VIXAL-4. It profits from panic in the markets, "because human beings always behave in such predictable ways when they're frightened". For the algorithm to work, it must sift global media and tap into the overload of foreboding generated by 24/7 online news and chatter.
"Digitalisation itself," says Alex, "is creating an epidemic of fear". Measure dread, calibrate it for the markets, and you win. As it happens, a background paper for the Foresight study (from a team led by Gautam Mitra of Brunel University) remarks on the difficulty of extracting information about the "black arts" of this fledgling discipline of "news analytics": "Even the content vendors are unwilling to reveal information about organizations which utilize NA in algorithmic trading".
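At its crudest, the idea can be sketched in a few lines. VIXAL-4 is fiction, and commercial news-analytics systems use far richer language models; this toy simply counts fear-laden words in headlines to build an aggregate dread score, then turns it into a trading stance. The lexicon and the threshold here are invented for illustration.

```python
# Illustrative toy only: VIXAL-4 is fiction, and real "news
# analytics" vendors use far richer NLP. This sketch counts
# fear-laden words in headlines to build a crude dread score.

FEAR_WORDS = {"panic", "crash", "fear", "crisis", "plunge", "contagion"}

def fear_score(headline):
    # Fraction of the headline's words drawn from the fear lexicon.
    words = headline.lower().split()
    return sum(w.strip(".,!?") in FEAR_WORDS for w in words) / max(len(words), 1)

def signal(headlines, threshold=0.1):
    # When aggregate dread crosses the threshold, a fear-driven
    # strategy might go "risk-off" (e.g. buy volatility).
    avg = sum(fear_score(h) for h in headlines) / len(headlines)
    return "risk-off" if avg > threshold else "hold"

print(signal(["Markets plunge as panic spreads", "Calm day on the exchanges"]))
```

The real "black arts" lie in doing this at scale and speed across the 24/7 overload of online news and chatter – which is precisely what the content vendors will not discuss.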
During the day, as market paroxysms in cyberspace alternate with material shocks in Geneva, the plot reveals the driver of Alex's rising panic and terror. And the Frankenstein motif grows clear. In an eerily deserted warehouse by the airport, surrounded by unstaffed banks of super-computers, the childless Alex feels moved "as he supposed a parent might be moved by witnessing a child for the first time unselfconsciously at large in the world". But not, of course, a docile child.
As fine examples of the genre always have, Harris's speculative fiction hothouses the seed of possibility that lurks within new technologies into a monstrous growth. Strikingly, the language of the Foresight report – compiled by cautious experts who perform their own exercises in intellectual hedging at every stage – often overlaps with that of The Fear Index. It even talks of "aggressive predatory algorithms".
In the real world, the scientists note, Credit Suisse runs a digital hunter-killer called "Guerrilla"; Goldman Sachs's algorithms include "Stealth". The paper submitted by Dave Cliff – professor of computer science at Bristol – reads uncannily like a commentary on Harris's novel. "Computer-designed and computer-optimised robot traders... could potentially come to replace current algorithms designed and refined by humans," Cliff predicts. "The simple fact is that we humans are made from hardware that is just too bandwidth-limited, and too slow, to compete with coming waves of computer technology."
Like many previous authors, Harris – or rather Alex Hoffmann – equates the ascent of cyber-intelligence with biological evolution. When Alex gazes at a scan of his "messy" brain after an assault, "Surely we can do better than this, he thought. This cannot be the end product." If so, then we had better pay heed to the automated markets where the resources that guarantee our well-being as organic entities increasingly reside. For Dave Cliff, "the performance of next-generation trading algorithms may be extremely difficult to understand or explain". Whether in fiction or in fact, we can no longer say we were not warned.
'The Future of Computer Trading in Financial Markets': www.bis.gov.uk/assets/bispartners/foresight/docs/computer-trading/