
Big data is the future – but will it help us to see the future?

The annuity business in particular has become a blatant rip-off: call it twilight robbery

Boyd Tonkin
Friday 21 March 2014 19:03 GMT
Alan Turing's Enigma machine (Getty Images)

Manchester has a lot to celebrate this week. Forget Robin van Persie’s hat-trick for United in the Champions League. Today the much-loved Central Library, whose rotunda has nurtured Mancunian talent since 1934, reopens after its £48m restoration. Wednesday’s Budget saw George Osborne announce further funds for research into graphene, the atom-thick super-material isolated at the city’s university by Andre Geim and Konstantin Novoselov in 2004. The Chancellor also unveiled the Alan Turing Institute for Data Science, backed by £42m of public money over five years. It will serve as a national memorial to the mathematician, code-breaker and computer pioneer who laid the intellectual foundations for modern life at the university’s Computing Laboratory.

It counts as progress of another sort that a Tory chancellor can pay tribute to Turing as a hero “persecuted for his sexuality by the country he helped to save”. Osborne wants the planned institute (which surely ought to be sited in Manchester) to act as the ultimate public-sector number-cruncher, a hive of mega-scale analytics “to ensure Britain leads the way again in the use of big data and algorithm research”. “Big Data” has become an ever-louder hi-tech buzzword since analysts such as Doug Laney began to spread the term a decade ago. It entails the gathering and organisation of the huge volumes of information spawned by a wired world to track, understand and – here’s the big “if” – possibly predict outcomes. In some utopian minds, big-data assessment promises a hard new branch of futurology. Others treat it as a smarter way of finding out what has happened to your parcel. (UPS was a leader in applying massive data sets to real-world problems of logistics.)

After the Budget, I asked Professor Viktor Mayer-Schönberger of Oxford University’s Internet Institute – co-author with Kenneth Cukier of the pioneer study Big Data – about the potential of the Turing centre. He says that “with big data we’ll have quite a novel way of making sense of reality (and derived from that, make often quite accurate predictions about the future)”. Yet “the processes by which we currently collect, analyse and make sense … are based on small data”, stuck in the time when information-harvesting was “expensive and time-consuming”. For him, “it will be like devising and learning a new language of science”. That will include “how we tell what we find – data visualisations are likely the new narrative, but how do we do that well?”. He hopes the new centre “will focus on this new language of science, much like Alan Turing helped us devise a language of logic”.

One of big data’s breakthroughs came in 2009 when it emerged that Google search records could track the likely path of new flu strains better than snail’s-pace reporting from public health laboratories. US star pollster (or poll analyst) Nate Silver delivered a masterclass in new-wave data-crunching when his cult site FiveThirtyEight forecast all 50 states correctly in the 2012 presidential election. At the time, the flaky augurs of the traditional media were still inspecting the chicken entrails and even predicting triumph for Mitt Romney (who?). By election day, FiveThirtyEight gave Obama a 91 per cent probability of victory.
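The mechanics behind such a forecast are less mysterious than the entrail-readers might fear. A minimal sketch in Python – with invented state probabilities, not anything from Silver’s actual model – shows how thousands of simulated elections turn state-level odds into a single headline number:

```python
# A toy Monte Carlo election forecast in the FiveThirtyEight style.
# The state probabilities below are hypothetical illustrations --
# they are not Nate Silver's model inputs.
import random

# state -> (electoral votes, probability the candidate carries it)
SWING_STATES = {
    "Ohio": (18, 0.75),
    "Florida": (29, 0.50),
    "Virginia": (13, 0.79),
}
SAFE_VOTES = 237      # assumed votes from states treated as certain
NEEDED = 270          # votes required to win the Electoral College
TRIALS = 100_000

wins = 0
for _ in range(TRIALS):
    total = SAFE_VOTES
    for votes, prob in SWING_STATES.values():
        if random.random() < prob:   # simulate this state's outcome
            total += votes
    if total >= NEEDED:
        wins += 1

print(f"Estimated win probability: {wins / TRIALS:.0%}")
```

The design is the point: the output is a probability, not a prophecy – precisely the hedge Silver insists on.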

Alan Turing, who was convicted of gross indecency in 1952, was granted a royal pardon last year (Susannah Ireland)

Yet Silver is notably cautious about the predictive power of big-data research. He argues that uncritical immersion in vast vats of poor-quality stuff can lead to “more distraction, more false positives, more bias and more reliance on machine-learning”. In understanding the present, let alone divining what’s to come, size alone does not matter. As Silver’s book The Signal and the Noise puts it, we can merely “strive to be less subjective, less irrational, and less wrong”. He did, though, once do some work on dating data and the correlation between chosen nights and preferred outcomes: “On Wednesday, apparently, you get the highest ratio of people who are just looking for something quick and dirty.”

Behind the popular fascination with this often arcane corner of science lies our perennial itch to know the future and prepare for it. That universal urge also underlay another plank in Osborne’s Budget speech. By liberating pensioners from the need to convert their savings into a lifetime annuity, he also challenged the status of the 300-year-old science – or semi-science – of actuarial prediction. This was long overdue. Here’s my forecast: we will pretty soon look back on the time when people in later life had to trust their security to Britain’s greedy, incompetent and often (look at the mis-selling scandals) corrupt pensions industry as evidence of a near-criminal conspiracy. The annuity business in particular has become a blatant rip-off: call it twilight robbery.

Yet behind this besuited con trick stands predictive mathematics not that different in its day from the soothsaying coups of the number-crunchers now. Life tables to measure survival chances go back to the 1660s. It was Edmond Halley (of comet fame) who, in 1693, worked out how to calculate the premiums for annuities from them. Actuaries became in effect the first professional guardians and analysts of “big data” sets. Their theorems still draw the outlines of retirement for millions.
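Halley’s method still underlies the sums. A minimal sketch in Python – with an invented toy life table, not his 1693 Breslau data – shows the core calculation: each future payment is discounted twice, once for interest and once for the chance of being alive to collect it:

```python
# A minimal sketch of the actuarial sum behind an annuity price.
# The life table below is a toy, not Halley's Breslau data.

def annuity_price(lives, age, payment=1.0, rate=0.04):
    """Expected present value of `payment` per year for life.

    lives[a] is the number of table lives still alive at age a,
    so survival from `age` to age + t is lives[age + t] / lives[age].
    Each payment is discounted for interest and for mortality.
    """
    price = 0.0
    t = 1
    while age + t in lives:
        survival = lives[age + t] / lives[age]
        discount = (1 + rate) ** -t
        price += payment * survival * discount
        t += 1
    return price

# Toy table: 1,000 lives at 65, thinning out with age.
LIFE_TABLE = {65: 1000, 66: 970, 67: 935, 68: 895, 69: 850, 70: 800}

# Fair price of an income of £1 a year, bought at age 65.
print(f"{annuity_price(LIFE_TABLE, age=65):.2f}")
```

Swap in real mortality data and a market interest rate and you have, in essence, the actuary’s trade; the annuity-seller’s margin is whatever sits above that fair price.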

The British conspiracy that Osborne broke this week had a peculiar horror. It yoked the actuarial approach to dependence on the vicissitudes of bank rates and the fluctuations of the markets. Who in their right mind would ever devise such a system as the motor for comfort in old age? British institutions – so often in the iron grip of City interests – speak with a forked tongue. On the one hand, they exhort us to plan for the future, to squirrel away funds against rainy – or idle – days, and to assume that we have the power to steer our financial destiny. On the other, they effectively require us to chuck our nest eggs at the roulette wheel of casino capitalism.

No wonder the UK savings ratio is so low. Or that millions of people in Britain believe that the “big data” sets of the past century yield one message: invest in property. Hence periodic house-price booms, and the mass faith in lavish profits from a hard-to-climb “housing ladder”. As Danny Dorling’s landmark critique of our property delirium All That Is Solid makes clear, we have fallen for a giant Ponzi swindle. “Climb on board and you too will win at the expense of some sucker below you. But all Ponzi schemes eventually fail. Everyone cannot get rich quick at the expense of future property buyers.”
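Dorling’s arithmetic is easy to check. A toy Python sketch, with wholly hypothetical figures, shows why every such ladder fails: if each winner must be paid by new entrants below, the required cohort grows geometrically and soon exhausts any finite population:

```python
# A toy illustration of why a Ponzi-style ladder must fail: if each
# buyer's gain is funded by new buyers beneath them, the cohort of
# entrants grows geometrically. All figures here are hypothetical.
population = 50_000_000   # assumed pool of potential buyers
cohort = 1_000            # first rung of the ladder
recruited = 0
rungs = 0
while recruited + cohort <= population:
    recruited += cohort
    rungs += 1
    cohort *= 2           # each buyer needs two new buyers below them

print(f"The ladder runs out of new buyers after {rungs} rungs.")
```

The doubling rate is arbitrary; any growth requirement above replacement hits the same wall.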

Back to big data and the tasks that await the Alan Turing Institute. All the indicators that impact on everyday lives – housing costs, savings returns, pension expectations – demand that we stake our chips on a fuzzy future. No amount of number-crunching by super-computers will ever bring that picture into a perfectly clear focus. In Big Data, Mayer-Schönberger and Cukier insist that the quasi-prophetic powers of information in oceanic volumes will never cover every angle and option. We must still attend to “the empty space, the cracks in the sidewalk, the unspoken and as-yet unthought”. Many people due to retire soon will have grown up listening to Doris Day warbling “Que Sera Sera”: “the future’s not ours to see”. Indeed not.

At present, we blunder about in a fog of imperfect and partisan information. Out of this mist we have to make future-limiting decisions. The lifting of the annuities yoke will only deepen that sense of uncertainty. George Osborne announced a new “right to impartial advice” for investors. He said little about the mechanism of its delivery.

As yet, most comment about the proposed Turing Institute has spoken of its benefits to business. Why not enlist it in the public service too? Democratise big data. Recruit it not merely to target the consumer, Google-style, but to empower the citizen. In theory, government backs public access to high-value information through projects such as the NHS Hospital Episodes Statistics. It even funds an Open Data Institute based – inevitably – in Shoreditch. Yet the debate about data use always turns on commercial applications.

If a cash-strapped government wishes to step back from cradle-to-grave obligations, then at least give everybody guided access to its ample resources of data computation to inform the extra choices that devolve on us. Between 1908 and 1911, the fledgling philosopher Ludwig Wittgenstein studied aeronautical engineering in – where else? – Manchester. (He even patented the prototype for a helicopter engine.) In a gnomic maxim, Wittgenstein later said: “If a lion could speak, we could not understand him.” The digitised world now blasts waves of raw information around us in a deafening lion’s roar. A new research hub should allow us all to interpret it, act on it, and plan possible futures. As well as harnessing its power for profit, the state could help every citizen hear – and tame – the lion of big data.
