
Official immigration figures are known to be inaccurate – so what about the rest of our national statistics?

It would be wrong to write off all statistics as inherently dodgy, but the figures have to be subject to expert challenge – and the ONS should be more receptive to such scrutiny

Ben Chu
Monday 16 May 2016 09:42 BST
Coming in to land: statistics on immigration did not take sufficient account of arrivals through small British airports (Jordan Mansfield/Getty Images)

Where do official statistics come from? From the Office for National Statistics (ONS), of course. But how are its figures actually produced? Where does the underlying data come from?

It’s not a question most of us tend to think about. And if we did, my guess is that many would assume that there’s some sort of central supercomputer, which is automatically fed digital data from all around the economy – from shops, from factories, from banks, from port authorities, from hospitals, and so on.

The bit about banks is sort of true. Financial institutions do regularly pass the ONS data on all the money in current accounts and all the loans outstanding. Yet people will probably be surprised to learn that a great deal of the rest of the data the ONS regularly produces – about GDP, unemployment rates, wages, household incomes, wealth and the like – comes from surveys.

ONS statisticians do not actually scrape digital information from the economy in the way that Facebook or Google automatically scrape data from their users’ accounts or from queries entered into a search engine.

What statisticians do is ask a roughly representative sample of firms and households to fill out forms detailing what they’ve spent recently. They ask a sample of arriving passengers at airports what they’ve come to the UK for. They visit a number of shops and check the prices of popular groceries. And from these samples they “gross up” to create a picture of the whole economy.

So if a sample of 1,000 construction firms from around the country saw a total fall in output of a certain magnitude in a month, statisticians assume that all construction firms in the country saw a roughly similar output fall and that becomes the national statistic for construction activity.
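
To make the grossing-up step concrete, here is a minimal sketch in Python. Every figure below is invented for illustration, and the real ONS methodology uses weighted, stratified samples rather than the single scaling assumption shown here.

```python
# Illustrative only: scale a sampled change in output up to a national estimate.
# All figures are hypothetical, not real ONS data.

sample_output_last_month = 950_000_000   # £, combined output of 1,000 sampled firms
sample_output_this_month = 931_000_000   # £, the same firms a month later

# The percentage change observed in the sample...
sample_change = (sample_output_this_month - sample_output_last_month) / sample_output_last_month

# ...is assumed to apply to the industry as a whole ("grossing up").
national_output_last_month = 14_000_000_000  # £, hypothetical industry total
national_estimate = national_output_last_month * (1 + sample_change)

print(f"Change seen in sample: {sample_change:.1%}")     # -2.0%
print(f"National estimate: £{national_estimate:,.0f}")   # £13,720,000,000
```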

If a group of families of middling income who took part in an ONS household finances survey saw their incomes stagnate over the past year, the ONS will conclude that the average income of all families in the UK stagnated. That will be published as the official figure. And so on.

Some readers will probably think this sounds a pretty flimsy way to generate important national statistics. Grossing up is bound to throw up inaccuracies. People will make mistakes, or even lie, when responding to surveys. Shouldn’t there be a more exact method?

But think about it: should all firms and households be wired up to some supercomputer to get a completely accurate picture? Do you fancy uploading your daily expenditure to an ONS terminal in your home every evening? Never mind the hassle, what about the privacy implications?

Perhaps that’s where we will, ultimately, end up, as the digital revolution unfolds and this kind of data becomes ever easier to transmit. We might not even have to think about sending it to the authorities one day.

But, in the meantime, there’s actually nothing inherently wrong with the ONS sampling methods. Provided the sample is big enough (taking in tens of thousands of firms and families in every survey) it can create a surprisingly accurate picture.
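
A back-of-the-envelope way to see why sample size matters: under a simple random sample, the standard error of an estimated proportion shrinks with the square root of the sample size. The Python sketch below is purely illustrative; real ONS surveys use stratified designs that generally do somewhat better than this.

```python
import math

# Standard error of a proportion p estimated from a simple random sample of size n.
def standard_error(p: float, n: int) -> float:
    return math.sqrt(p * (1 - p) / n)

p = 0.05  # suppose 5% of households genuinely have some characteristic
for n in (1_000, 10_000, 100_000):
    se = standard_error(p, n)
    # Roughly 95% of samples land within about two standard errors of the truth.
    print(f"n = {n:>7,}: estimate 5.0% +/- {2 * se:.2%}")
```

With 100,000 respondents the uncertainty band is under 0.15 percentage points either way, which is why large surveys can paint a surprisingly accurate national picture.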

We know it’s accurate because an economy’s data has to add up. In a national economy, aggregate income must equal aggregate expenditure and this must equal aggregate output. And the data the surveys produce does – more or less – add up.
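
In the notation of the national accounts, that consistency check is the standard identity (a textbook formulation, not the article’s own): output, total income and total expenditure, the last broken into consumption, investment, government spending and net exports, must all be the same number.

```latex
\text{Output} \;\equiv\; \text{Income} \;\equiv\;
\underbrace{C + I + G + (X - M)}_{\text{Expenditure}}
```

Any gap between the three independently surveyed totals shows up as a statistical discrepancy that has to be reconciled before publication.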

But not always, and we should be wary of the pitfalls of the survey method too. Last week the ONS produced a report on a long-running dispute about the true levels of migration to the UK. The number of National Insurance (NI) numbers issued to European Union citizens, so that they can enter the UK labour market, has been much higher than the official figures on migrants from these countries suggest.

The ONS concluded that its main figures – based largely on the International Passenger Survey (IPS), which interviews up to 800,000 people entering and leaving the UK each year – were a more reliable estimate of the scale of long-term immigration flows to Britain than the NI numbers.

So how accurate is the IPS? Mervyn King, the former Governor of the Bank of England, pointed out to MPs in 2006 that the IPS’s sampling resources were overwhelmingly concentrated at big airports. Regional airports were barely covered.

King added that the grand total of airline passengers in 2005 arriving outside Heathrow, Gatwick or Manchester who actually said to an IPS researcher, “Yes, I am a migrant coming into the UK”, was an almost comically small 79. And this was despite the fact that much of the increase in passengers arriving in the UK from Poland and other countries in those years was concentrated in regional airports. In other words, there was a major sampling error.

The ONS eventually admitted this deficiency in the IPS and, in 2009, changed its coverage to beef up its presence at regional airports. But one respected migration expert, Jonathan Portes of the National Institute of Economic and Social Research, suspects the IPS is still not capturing the true picture on immigration.

Yet the most glaring example of the deficiency of the sampling approach lies in the ONS’s household wealth calculations. It is pretty much impossible to locate a credible researcher in this field who believes the ONS’s Wealth and Assets Survey gives an accurate picture of the assets of the wealthiest, thanks to under-sampling of those at the very top.

“I doubt the Duke of Westminster sits down and fills out these forms,” says Paul Johnson of the Institute for Fiscal Studies, referring to the UK’s wealthiest landowner. Most researchers prefer to estimate the wealth of those lucky few at the summit of the mountain from tax data and newspaper “rich lists”, rather than relying on what the ONS produces.

This is sensitive territory. It would be wrong for people to write off all ONS statistics as inherently dodgy. That way lie philistinism and anti-scientific demagoguery. But at the same time the statistics have to be subject to expert challenge – and the ONS should be more receptive to such challenges than it has been in recent years.

As a recent Government-commissioned review of the ONS’s work by Sir Charles Bean concluded, the statistics agency should be more “intellectually curious” and collaborate more with outside researchers.

Where does that leave the rest of us though? What should we do when even the experts disagree over the statistics? Deepening our knowledge of where these figures, which profoundly shape our political debates, come from would be a good start.
