The Big Question: How much faith should we have in political opinion polls?

Sean O'Grady
Tuesday 24 October 2006 00:00 BST

Why are we asking?

Polls are under scrutiny by the political classes this week after a new one yesterday claimed the Tory leader David Cameron has, after a 10-month honeymoon, fallen out of favour with voters. A separate poll, also published yesterday, found that 51 per cent would prefer to see Gordon Brown as Prime Minister, compared with 24 per cent who hope to see Cameron in No 10. Today's Independent poll has bad news for the Government - two-thirds of the public believe the war in Iraq is unwinnable, and 62 per cent say we should withdraw as soon as possible.

How accurate are opinion polls?

The astonishing thing is that opinion polls are as accurate as they are. Think about it. The pollsters contact 1,000 people and extrapolate from their answers the views of a varied, multicultural nation of 44 million voters. And yet it works. Polls do not always predict election results exactly, but they are good at foretelling which party will win the most votes. In a close contest fought under the vagaries of the first-past-the-post electoral system the actual parliamentary outcome will be hard to call, but the polls usually come close to the actual outturn.

Since the Second World War, only the 1951, 1970 and the 1992 general elections can be regarded as ones where pollsters might have got things badly "wrong". That said, the polls don't usually try to "predict" anything; they are a snapshot of public opinion. People are usually asked how they'd vote if a general election were held tomorrow, not in 2009, and we're bad at forecasting how we'll behave in the future.

So what else should we take into account?

Poll figures should be read in conjunction with results from local, regional, European or by-elections, and with common sense. The "mid-term blues" have long been a tradition in British politics. In the 1950s and 1980s, the Conservatives were often behind in the polls, yet they won all the general elections. The mid-term effect has been a less strong feature since the 1990s, when John Major never escaped his blues and, with the exception of the fuel protests of 2000, Tony Blair enjoyed an unbroken honeymoon with the voters for his first two terms. Three-party politics also makes things harder to call.

What about the margin of error?

With regard to voting intention, a poll rating is usually accurate to within three percentage points. That means that, say, a Labour showing of 33 per cent and a Conservative rating of 35 per cent could mean in reality that Labour is on 30 per cent and the Tories on 38 per cent. Or that Labour is "really" on 36 per cent and the Tories on 32 per cent, so Labour rather than the Tories is in the lead. All within the margin of error: the Tories' apparent two-point lead could in reality be anything from an eight-point lead to a four-point deficit. The three-point margin applies in 19 polls out of 20 with a sample of 1,000.
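For readers who want to see where the three-point figure comes from, here is a minimal sketch in Python of the standard 95 per cent margin-of-error calculation for a simple random sample; the function name and the exact shares plugged in are illustrative, not taken from any pollster's own workings.

```python
import math

def margin_of_error(p, n, z=1.96):
    """95 per cent margin of error for a reported share p from a random sample of n."""
    return z * math.sqrt(p * (1 - p) / n)

# The worst case is a 50/50 split: with 1,000 respondents that gives roughly +/-3 points.
print(round(margin_of_error(0.5, 1000) * 100, 1))   # about 3.1
# A party reported on 33 per cent has a slightly tighter interval.
print(round(margin_of_error(0.33, 1000) * 100, 1))  # about 2.9
```

The "19 out of 20" wording is simply the 95 per cent confidence level in that formula: one poll in 20 will, by chance, miss by more than the stated margin even if everything else has been done perfectly.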

What makes a bad poll?

Political polling is badly served by people unwilling to tell the truth (ironically given what the public say about politicians fibbing). This became acute in the late Thatcher/John Major era when fewer people were prepared to admit to pollsters that they were Conservatives. This helped to give the Tories artificially low poll ratings that then made it even more embarrassing for people to say they were going to vote Conservative (on top of the perceived stigma of being associated with the "nasty party").

This "spiral of silence" meant that "shy Tories" went underreported and helped to mess up the opinion polls (not least because they were more likely to vote). Perhaps New Labour may become similarly socially unpopular. A bad poll also asks loaded questions. Thus asking the public of they're willing to join the euro would probably get a different response to one where they were asked if they were willing to give up the UK's independence to join the euro. Similarly, human beings are naturally more inclined to say "yes" than "no" so, again, questions have to be framed carefully.

What makes a good poll?

Apart from a degree of luck, the more fastidiously a poll has been conducted, the better. Market-research companies have long weighted answers to make their samples more representative of the population as a whole. Thus, if there were too few elderly men in the sample and too many women in their twenties, the responses of the relatively few older men would be given more weight and those of the younger women less, until the sample matched the demographic profile of the nation (as measured by the National Readership Survey).
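As a rough illustration of how that weighting works, the sketch below scales up under-represented groups and scales down over-represented ones; the two groups, the sample counts and the population shares are invented for the example, not figures from the National Readership Survey.

```python
# Invented figures for illustration only
sample_counts = {"men 65 and over": 60, "women in their twenties": 180}         # respondents per group
population_shares = {"men 65 and over": 0.09, "women in their twenties": 0.10}  # share of the electorate

sample_size = 1000
weights = {
    group: (population_shares[group] * sample_size) / sample_counts[group]
    for group in sample_counts
}
# Each under-represented older man now counts for 1.5 answers,
# each over-represented younger woman for roughly 0.56 of an answer.
print(weights)
```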

Nowadays, more pollsters also weight their results by past vote, asking people how they voted at the last general election. A particular case in point is YouGov, an organisation that polls via the internet. Even now, web use is not universal, but YouGov argues that people are more likely to tell the truth online than in face-to-face street or door-to-door surveys, or over the telephone. As always, the question that's asked matters, but sometimes it's impossible to find one that can be accepted by all. Do you ask if people want to ban "bloodsports" or "fieldsports"? The Market Research Society has guidelines for the production of polls.

What about apathy?

People are less willing to vote, and the pollsters have tried to reflect this in their work. They now ask a range of questions to find out how determined people are to vote. An older problem is what to do with those who say they "don't know" how they'll vote. They are usually excluded from the results.
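A hedged sketch of how those two adjustments might be combined is below: it drops the "don't knows" and then weights each remaining answer by a stated likelihood-to-vote score out of 10. The respondent records and the scoring scale are hypothetical, not any particular pollster's method.

```python
# Hypothetical respondent records: party choice plus a 1-10 likelihood-to-vote score
respondents = [
    {"party": "Labour", "likelihood": 10},
    {"party": "Conservative", "likelihood": 9},
    {"party": "Don't know", "likelihood": 7},
    {"party": "Liberal Democrat", "likelihood": 3},
]

# Exclude the don't-knows, then weight each remaining answer by likelihood to vote
committed = [r for r in respondents if r["party"] != "Don't know"]
total = sum(r["likelihood"] for r in committed)
shares = {}
for r in committed:
    shares[r["party"]] = shares.get(r["party"], 0) + r["likelihood"] / total

print({party: round(share * 100) for party, share in shares.items()})
# {'Labour': 45, 'Conservative': 41, 'Liberal Democrat': 14}
```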

So should we get out of Iraq?

The Independent's poll suggests the public think so, even though they also think a civil war will follow if we do. When the answers are as clear-cut as they are in this poll, the margin of error is not a factor. However, how much such issues "drive the vote" is less clear: many voters might be against the war in Iraq but will still vote Labour at the next general election. It depends on the "salience", or importance, of the issue. Iraq may matter less to people in 2008 or 2009 than it does now, especially as we know Mr Blair won't be a factor by then.

Should we trust the opinion polls?

Yes...

* They're surprisingly accurate when you consider the scale of the task

* They are fine if taken in moderation and diluted by other evidence such as by-elections

* Even if they're occasionally ropey, they are all we've got

No...

* Polls have been wrong when it really matters, in close-run elections

* Only real votes in real ballot boxes count because people often lie to pollsters

* The public are fickle and volatile so it is pointless to worry about what they think today
