Flicking through a day's newspapers often feels like tackling a numerical assault course. "Young people who use sunbeds increase their risk of skin cancer by 75 per cent". "Ninety-six per cent of children in European orphanages are not orphans". "In the UK we throw away 4.4 million apples a year". These are just three examples among dozens culled from yesterday's nationals.
Number-crunching has always had the potential to bamboozle, and today, more than ever, is the age of the fraction, the percentage and the average (but is that a mean or a median?). It's not just the newspapers either. Numbers crop up in adverts, health warnings and speeches made by politicians too. But do the figures add up? And do we trust them?
Not really, according to, yes, another set of stats that dropped into the Independent's inbox last month. In a survey by the Office for National Statistics, only 36 per cent of people asked thought that official figures were "generally accurate". Meanwhile, a 2007 poll of trust in government statistics by the European Commission ranked Britain 27th out of 27 countries.
Last week, a statistics watchdog was launched to tackle this apparent crisis in confidence. The UK Statistics Authority is tasked with ensuring statistics are correct and free from government spin. Every day, its website will provide links to the raw data backing up government statistics, and the Authority will, warns its chair, Sir Michael Scholar, "name and shame" ministers who spin them beyond recognition. "It's vital that statistics aren't altered to tell the story somebody wants to tell," Scholar says.
Kevin McConway, a senior statistics lecturer at the Open University, says statistical abuse damages his profession. "Statistics in the UK are actually pretty reliable but more and more often you see surveys that mean nothing, data that looks important but isn't, or statistics that are just made up. It destroys public trust in all statistics."
To name and shame some of the worst offenders, The Independent has trawled the archives for classic examples of "junk statistics", from poorly worded reports to the deliberate massaging of official figures, and asked McConway to read between the lines.
"Over 40 per cent of families spend eight hours or more a week together"
Commissioning a scholarly survey or study is a popular choice for companies who want to get their names in the papers. And disappointing results needn't get in the way of a bit of PR. Last month, the family holiday firm Center Parcs sent out a release designed to counter the image of the British family in terminal decline.
The headline read: "The Family – It's Not Toxic, It's Thriving". And the best stat Center Parcs could muster to back up its claim? "With over 40 per cent of families spending eight hours or more a week together... a new study suggests that, actually, families like each other and want to spend time together."
Is that really what it suggests? If 40 per cent of families spend eight hours or more together, that must mean that 60 per cent (or, to put it another way, the majority) spend fewer than eight hours a week together. And eight hours a week is equivalent to 68.6 minutes a day. Put it that way and the figures hardly endorse Center Parcs' vision of the "thriving" family.
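The arithmetic behind that reading takes only a few lines to check. A minimal sketch, using the figures as quoted in the release:

```python
# Re-deriving the figures behind the Center Parcs release.
hours_per_week = 8
minutes_per_day = hours_per_week * 60 / 7  # eight hours spread over seven days

share_at_claim = 40                 # per cent of families at 8+ hours a week
share_below = 100 - share_at_claim  # the majority the release glosses over

print(f"{minutes_per_day:.1f} minutes a day")  # → 68.6 minutes a day
print(f"{share_below} per cent spend fewer than 8 hours a week together")
```

Framed as "just over an hour a day, for fewer than half of families", the same numbers tell the opposite story.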
McConway's verdict: "There's often this thought that 'oh it's numbers so it must be right', but often it's nonsense, especially when a company cherry-picks results and plays down the rest. It's OK, as long as we are aware of it and get enough information to work out the real statistics."
"Toothless post-menopausal women are three times as prone to hypertension as those with teeth"
This news, reported in the respected journal Hypertension, might have led to queues of denture-wearing women of a certain age at GPs' surgeries. A study by Japanese researchers from Hiroshima University, published in 2004, suggested that tooth loss in post-menopausal women was directly linked to high blood pressure, which can increase the risk of heart disease or strokes.
But a look past the headlines revealed a problem: the scientists based the conclusion on a study of just 98 post-menopausal women – 67 with missing teeth, and 31 with their gnashers intact. In statistical terms, that is a very small sample, easily swayed by chance.
The problem is that an apparent link can sometimes be pure chance. The smaller the sample, the more likely this becomes. One statistician famously managed to find a statistically significant correlation (in a small enough sample) between birth rates in various European countries and the stork population, suggesting the birds therefore really do deliver babies.
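The stork effect can be reproduced with random numbers: generate two completely unrelated variables and count how often a "strong" correlation appears by chance at different sample sizes. A rough simulation sketch (the sample sizes and the 0.5 threshold are illustrative, not taken from the study):

```python
import random

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient, computed by hand."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def chance_of_strong_r(sample_size, trials=5000, threshold=0.5):
    """Fraction of trials where two INDEPENDENT variables look correlated."""
    random.seed(0)
    hits = 0
    for _ in range(trials):
        xs = [random.random() for _ in range(sample_size)]
        ys = [random.random() for _ in range(sample_size)]
        if abs(pearson_r(xs, ys)) > threshold:
            hits += 1
    return hits / trials

print(chance_of_strong_r(8))    # small sample: spurious "links" turn up often
print(chance_of_strong_r(100))  # larger sample: they all but vanish
```

With eight data points, pure noise regularly masquerades as a relationship; with a hundred, it almost never does.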
McConway's verdict: "There's no standard minimum group size for statistical studies – it depends what you're measuring. If it's something that doesn't vary much – say, blood pressure in elite athletes – you could get away with a smaller group. But for something like this, you need a much larger sample."
"A sausage a day increases the risk of bowel cancer by a fifth"
Last month, research circulated by the World Cancer Research Fund (WCRF) suggested that eating 50g of processed meat a day – equivalent to one sausage – increases the likelihood of bowel cancer by a fifth, or 20 per cent. It sounds worrying. After all, a 100 per cent risk would mean you are guaranteed to develop cancer, and that figure of 20 per cent doesn't seem far off. But the reality is neither as simple, nor as scary, as that.
Research shows that, out of every 100 people, around five will develop bowel cancer within their lifetime. So what impact does eating sausages really have? If you take 100 people who eat 50g of processed meat a day, the number of cases will rise by a fifth – from five in 100 to six in 100. So, to 99 of the 100 porkers, eating all those sausages will make no difference at all. But, of course, that seems far less shocking than the headline figure of a 20 per cent rise. And how many people eat seven sausages a week anyway?
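The distinction the headline blurs is between relative risk (the 20 per cent) and absolute risk (one extra case per hundred). A quick check of the figures quoted above:

```python
# Relative vs absolute risk, using the figures from the WCRF story.
baseline_cases_per_100 = 5   # lifetime bowel-cancer cases without the sausages
relative_increase = 0.20     # the headline "20 per cent" rise

new_cases_per_100 = baseline_cases_per_100 * (1 + relative_increase)
absolute_increase = new_cases_per_100 - baseline_cases_per_100

print(new_cases_per_100)   # → 6.0 cases per 100 daily sausage-eaters
print(absolute_increase)   # → 1.0 extra case per 100 people
```

The same change can honestly be described as "a 20 per cent rise" or "one extra case in a hundred"; only the second tells the reader how big the risk actually is.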
McConway's verdict: "You barely go a week without seeing examples in the papers of stats appearing to indicate a significant increase. Twenty per cent sounds big, but it's only an increase on a small percentage – 20 per cent on next to nothing is still next to nothing."
"Speed cameras cause a 35 per cent decrease in deaths and serious injuries"
In 2003, the then Transport Secretary, Alistair Darling, issued a press release that read: "Deaths and serious injuries fell by 35 per cent on roads where speed cameras have been in operation." Darling went on to say: "The report clearly shows speed cameras are working... This means that more lives can be saved and more injuries avoided."
The suggestion that cameras caused the drop in accidents got Darling in trouble. Figures go up and down all the time. Contentious issues get more coverage when numbers (crime figures, say) are high. So the government does something about it. The numbers, having peaked, then go down. Naturally, the government takes credit for the fall. Challenged to prove the link in the speed-camera case, transport ministers revised their claim.
McConway's verdict: "This happens all the time. The statistical jargon is 'regression to the mean': over time, figures that peak or trough will, on average, head towards the middle, or mean. There are ways to take this effect into account when producing stats like these, but it does not always happen."
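Regression to the mean is easy to demonstrate with simulated data: give every road the same true accident rate, pick the "worst" sites in one period, and watch them improve in the next with no intervention at all. A rough sketch (all numbers invented for illustration):

```python
import random

random.seed(1)

# Every "road" has the same true accident rate; observed counts fluctuate by chance.
true_rate = 20
n_roads = 1000
year1 = [true_rate + random.gauss(0, 5) for _ in range(n_roads)]
year2 = [true_rate + random.gauss(0, 5) for _ in range(n_roads)]

# "Install cameras" at the 100 roads with the worst year-1 figures...
worst = sorted(range(n_roads), key=lambda i: year1[i], reverse=True)[:100]

before = sum(year1[i] for i in worst) / len(worst)
after = sum(year2[i] for i in worst) / len(worst)  # no cameras: chance alone

print(f"before: {before:.1f}, after: {after:.1f}")  # "after" falls back toward 20
```

The selected roads were extreme in year one partly by luck, so on average they drift back towards the true rate in year two – exactly the fall a minister might be tempted to claim as a policy success.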
"The number of American children gunned down has doubled every year since 1950"
Sometimes junk statistics are caused simply by lazy wording. Perhaps the best (worst) example came in a prospective PhD student's dissertation, published in 1995. It appeared in the first chapter of Damned Lies and Statistics by Joel Best, who called it "the worst social statistic ever". It read: "Every year since 1950, the number of American children gunned down has doubled." Really? Let's do the maths. Say only one child was gunned down in 1950. According to our student, that number would have doubled every year, so two dead in 1951, four in 1952, eight in 1953... that makes 1,024 in 1960, and so on. By 1995, the year of the report, more than – wait for it – 35 trillion children were gunned down. That's really quite a lot.
It turns out that the student had taken the figure from a government report, which stated: "The number of American children killed each year by guns has doubled since 1950." So the figure had doubled over 45 years, not every year. By garbling his words, the student came up with a wildly inaccurate statistic.
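The gap between the two readings is easy to quantify, following the article's own starting assumption of one victim in 1950:

```python
# Compounding the "doubled every year" claim vs the report's actual wording.
start = 1            # hypothetical count in 1950 (the article's own assumption)
years = 1995 - 1950  # 45 years of doubling

doubled_every_year = start * 2 ** years  # the student's garbled version
doubled_once = start * 2                 # what the report actually said

print(doubled_every_year)  # → 35184372088832: about 35 trillion
print(doubled_once)        # → 2
```

Forty-five annual doublings multiply the starting figure by 2^45; a single doubling over forty-five years multiplies it by two. The slip of a few words changes the claim by thirteen orders of magnitude.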
McConway's verdict: "It's so important to be precise when writing about statistics. There are two conflicting pressures – to keep it simple and to tell the story properly. This case is terrible, but sometimes even statisticians get it wrong."
"Falling coconuts kill 150 people a year"
In 2002, in an article about the uprooting of coconut trees by lawsuit-wary Australian officials, the Daily Telegraph reported: "Coconuts... kill about 150 people worldwide each year, making them more dangerous than sharks." The figure appeared again in a press release issued by a travel insurance firm keen to assure holidaymakers they would be covered should they be struck by a coconut. The reports suggested the figure of 150 came from a Canadian professor, but his paper on coconut injuries did not posit a death toll. Various attempts to trace the origin of the figure have failed.
The case echoes a similar statistical legend – the belief that to be healthy we should drink eight glasses of water a day. Last week, researchers at the University of Pennsylvania decided to search for the source of this statistic. Their conclusion: "It is unclear where this recommendation came from." In other words, they could not find any study to support the "eight glasses" claim.
McConway's verdict: "The water statistic has, in fact, been doubted for years. The truth is simply that we like cold hard figures, especially when they make for a great story."