Q&A: How do opinion polls work?
The 2015 general election result took political pollsters by surprise and a panel of experts has now said that, put simply, their predictions were wrong because they spoke to the wrong people.
But what exactly is polling, how does it work and why is it so hard to get the right answer?
What is a poll?
In a nutshell, it's a scientific survey designed to measure the views of a particular group of people - for example, the UK electorate.
How do they work?
You don't have to eat an entire bowl of soup to know if it tastes good - if it's properly stirred, one spoonful is enough.
That's the analogy most commonly used to explain the principles behind polling - if you ask the right people, they can be relatively small in number but still give you an accurate representation of the views of everyone.
Who does them?
Well, anybody can carry out a survey, but those that pass muster - and reach a "scientific" standard - are done predominantly by reputable, independent polling companies. The 10 which featured most prominently in the 2015 general election were Ashcroft, ComRes, ICM, Ipsos MORI, Opinium, Panelbase, Populus, Survation, TNS and YouGov.
They carry out some polls independently while others are funded by media organisations. And then some are conducted on behalf of a partisan client - a particular political party, lobbying organisation, or interest group - who may be commissioning the survey because they expect a particular response and may only make the results public if the answers are to their liking.
How do they decide who to question?
The perfect option would be to obtain a genuinely random sample of people, but in reality that's nigh on impossible. You'd have to pick them from the electoral roll, but that doesn't have any phone numbers or email addresses, so you'd have to show up on the doorstep of every single one of them - and then they'd all have to agree to talk to you. Alternatively, you could send them all a letter, but that would take ages and the response rate would be very poor.
Instead, polling firms use something called quota sampling. It involves designing a sample of people that's representative of the population as a whole, with the right balance of gender, age, socio-economic background, and so on. Interviewers then have to go out and find them to tick each one of those boxes.
Results can also be weighted, so if you only manage to find 100 people aged 18 to 34, but needed to find 110, you can weight the response of each one of those you did find to count as 1.1 people in order to better reflect the population as a whole.
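To make the arithmetic concrete, here is a minimal sketch in Python of the weighting idea described above, using the article's figures of 100 respondents found against a target of 110. The raw support figure is purely hypothetical and not from any real poll.

```python
# Illustrative sketch only (not any polling firm's actual code): weighting a
# shortfall in one demographic group so it counts as its target share.
target = 110   # respondents needed in the 18-34 group for a representative sample
found = 100    # respondents actually reached

weight = target / found          # 1.1 - each response counts as 1.1 people
raw_support = 42                 # hypothetical number in the group backing a party
weighted_support = raw_support * weight

print(f"Each 18-34 respondent is weighted by {weight:.2f}")
print(f"{raw_support} raw responses count as {weighted_support:.1f} weighted responses")
```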
How are people questioned?
Most commonly, it's done over the phone or via the internet.
One of the biggest firms, YouGov, does its polls online. It has collected a panel of more than 360,000 people in a variety of ways - including via standard advertising. Then, when it needs to conduct a particular poll, it chooses a quota sample from that panel. Those people are then asked to fill in an online survey.
Ipsos MORI is one of the firms to use phone polling. It explains that it calls a combination of mobiles and landlines through random-digit dialling, which means it reaches ex-directory people as well as those in the phone book. The interviewer takes demographic details from whoever picks up so they can see where they fit in the quota sample.
Bear in mind, though, that those who are unemployed, retired or work from home are more likely to answer the phone. And, as the UK Polling Report puts it: "There may also be attitudinal biases - people who are willing to give 20 minutes of their life to a stranger on the phone asking impertinent questions may have a different outlook on life to those who won't."
Why not ask more people?
A biased sample of a million people is no better than a biased sample of a thousand. The bigger the scientific sample, the better, but a polling company - and the entity paying for the poll - has to balance the cost and time involved with the accuracy of the result.
The more people you question the longer it takes and the more it costs - and if you're trying to track day-to-day changes in voter sentiment, it simply isn't practical to question vast numbers of people each time. A sample size of around 1,000 is often settled upon as a good balance.
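As a rough illustration of why roughly 1,000 is a common choice, here is a short Python sketch of the standard 95% margin-of-error formula for a proportion. It assumes a simple random sample, which, as explained above, real polls only approximate, so the figures are indicative rather than a statement of any pollster's method.

```python
# Rough sketch: the 95% margin of error shrinks only with the square root of
# the sample size, so extra respondents quickly stop paying for themselves.
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (250, 500, 1000, 2000, 10000):
    print(f"n={n:>6}: +/- {margin_of_error(n) * 100:.1f} points")
```

At n=1,000 this works out at roughly plus or minus three percentage points; quadrupling the sample to 4,000 only halves that.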
What went wrong in 2015?
An independent inquiry, commissioned by the British Polling Council and the Market Research Society, found that pollsters spoke to too many likely Labour voters and not enough likely Conservatives.
The biggest reason was that not enough older people - who are more likely to be Tory voters - were questioned.
Tory voters in general are also said by pollsters to be more likely to put the phone down or be ex-directory, and less likely to answer the door. And busy professionals who have less time to respond to polls are statistically more likely to vote Conservative.
Weighting should ease these sorts of problems, and has done in the past, but it appears it wasn't effective enough in 2015.
Those questioned were also more engaged in politics than average, which led the pollsters to make errors about the expected level of turnout among different groups. They knew that turnout would be higher among older voters than younger voters but they underestimated the difference.
Finally, the experts suggested there might have been "herding" - polling firms stuck together, not wanting to go out on a limb, so they designed their surveys and weighted their responses to fall closely in line with their rivals.
What do the pollsters say?
That they'll do better next time. They'll recruit more older people and rework their models to produce more accurate results. Joe Twyman, from YouGov, compared the situation to that of the Met Office after it failed to predict the 1987 hurricane. It went back to the drawing board, he said, found better ways to make predictions, and now, generally speaking, people trust its forecasts again.
However, Ben Page, chief executive of Ipsos MORI, has made the point that resources are an issue. Just three of his 1,500 employees in London are doing election polling and the money available to do it is "minuscule" compared to what's spent on other sorts of research.
All the pollsters also point out that even a "perfect" random sample, conducted on an infinite budget, will, just by the laws of probability, be outside the margin of error one time in 20.
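That "one time in 20" claim can be checked with a quick simulation, sketched below in Python. The true level of support (40%) and the sample size of 1,000 are assumed values chosen for illustration, not figures from the article.

```python
# Quick simulation of the "one time in 20" point: even perfect random samples
# of 1,000 voters miss the true figure by more than the 95% margin of error
# about 5% of the time, purely by chance.
import random

true_share = 0.40      # assumed true support for a party (illustrative)
n, trials = 1000, 10000
moe = 1.96 * (true_share * (1 - true_share) / n) ** 0.5

misses = 0
for _ in range(trials):
    sample = sum(random.random() < true_share for _ in range(n))
    if abs(sample / n - true_share) > moe:
        misses += 1

print(f"Samples outside the margin of error: {misses / trials:.1%}")  # roughly 5%
```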
What about the politicians?
It's rule one in the politician's handbook - always say you don't pay any attention to polls - but of course, the reality is quite different.
Labour MP Ben Bradshaw has accused the "commentariat" of being "too relaxed" about the pollsters' failure, saying he believes it affected the result of the election because all the talk in the run-up was about the all-but-inevitable hung parliament, meaning people voted on that basis.
Former Lib Dem minister Norman Lamb agrees, telling the BBC that endless talk of what would happen in the event of a hung parliament - particularly the doom-laden Tory warnings of an SNP-Labour tie-up - led people to vote "out of fear" for the Conservatives. He believes that if the polls had correctly predicted a Tory majority, people would have voted differently.
So should polls be banned?
Some countries, France, Spain and Italy among them, ban all opinion polling for a defined period in the run-up to an election.
Following last May's vote, one Labour peer, Lord Foulkes, called for just such a ban in the UK, accusing newspapers of using polls as a tool to sway voters, creating a "bandwagon" for people to jump on rather than providing an independent assessment of the public mood.
The suggestion was dismissed by ComRes chief Andrew Hawkins who said it would be "bad for democracy" and, given the existence of social and other online media, an attempt to ban opinion polls would be pointless. He also said it was "insulting" to the electorate to suggest they'd be swayed by poll findings.