There are few more treacherous stories for a political journalist to write than those that involve statistics.
Take just three recent examples. “Britain is growing faster than any other major advanced economy, official figures reveal”. “Up to 1.5 million people could be addicted to prescription drugs, MPs have warned.” Or “600,000 unemployed migrants are living in Britain, an EU study has found”.
At first these stories seem beguilingly straightforward. But take a closer look and they are all, to a greater or lesser extent, based upon statistical nonsense.
Take the growth story. Yes, it is technically true that Britain is growing faster than any other major advanced economy – but only over the last three months. Over a more meaningful period – since the Coalition came to power in 2010 – only France and Italy have had lower GDP growth rates than us, while the US economy has grown twice as fast.
Next up: prescription drugs. The 1.5 million figure, quoted by the Home Affairs Select Committee, is actually an extrapolation from a 2001 poll by the BBC Panorama programme.
It found that three per cent of the 2,000 adults surveyed had been taking benzodiazepine tranquillisers on prescription for longer than four months. Hardly up to date, and hardly an authoritative figure.
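For illustration, the arithmetic behind that extrapolation can be sketched as follows. The UK adult population figure used here (roughly 50 million) is an assumption for the sake of the example, not a number from the report:

```python
# Sketch of the extrapolation behind the "1.5 million" headline.
# The 3% figure and the 2,000-person sample come from the 2001 Panorama poll;
# the ~50 million UK adult population is an illustrative assumption.

sample_size = 2_000
pct_long_term = 3  # per cent reporting long-term benzodiazepine use

respondents = sample_size * pct_long_term // 100   # people in the actual poll
uk_adults = 50_000_000                             # assumed adult population
estimate = uk_adults * pct_long_term // 100        # scaled-up national figure

print(respondents)  # 60
print(estimate)     # 1500000
```

The point the article makes is visible in the numbers: a headline figure of 1.5 million rests on just 60 survey responses, collected over a decade earlier.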
Finally, the 600,000 unemployed migrants. This actually refers to the number of economically “non-active” EU migrants living in Britain. But “non-active” is not the same thing as unemployed. According to the European Commission, only 28 per cent of the total is made up of jobseekers, less than the proportion accounted for by pensioners (30 per cent). The figure also included students, the disabled and those who don’t need to work.
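The Commission percentages quoted above can be applied to the headline total to show how few of the 600,000 are actually jobseekers. A minimal sketch, using only the figures given in the text:

```python
# Breakdown of the 600,000 economically "non-active" EU migrants,
# using the European Commission percentages quoted in the article
# (28% jobseekers, 30% pensioners).

total = 600_000

jobseekers = total * 28 // 100   # those actually looking for work
pensioners = total * 30 // 100   # a larger group than the jobseekers

print(jobseekers)  # 168000
print(pensioners)  # 180000
```

On the article's own figures, fewer than a third of the "600,000 unemployed migrants" are unemployed in any ordinary sense.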
These examples – all deconstructed by the statistical fact-checking website Full Fact – clearly illustrate three of the problems with statistics: selective use of figures, insufficient data and misinterpretation.
While newspapers, politicians and the public may misinterpret statistics, an even more significant problem arises if the underlying data used to produce them is itself inaccurate.
And yesterday that is exactly what Sir Andrew Dilnot, chair of the UK Statistics Authority, suggested might be the case.
He told the Public Administration Select Committee that UKSA is to launch an inquiry to establish how accurate the so-called “administrative data” collected across the public sector really is.
The inquiry comes after UKSA found significant discrepancies in recorded crime figures, suggesting that police forces were deliberately under-reporting crime.
Although it won’t say so publicly, UKSA is concerned that other public-sector bodies may also be recording inaccurate statistics because they have an ulterior interest in doing so.
This matters because statistical information is used across the public sector to make policy and determine priorities, and if it is incorrect or incomplete we may end up making bad policy or pursuing the wrong priorities.
To give just one example: if crime appears to be falling, then that might provide evidence for reducing police budgets. But if crime is simply going unreported, that reduction would be based on a false assumption.
None of this is easy or straightforward and the UKSA report is unlikely to be clear cut. But the basic point is clear: we the public need to care more about how we use statistics and the Government needs to care more about making sure they are accurate.