Polls were wrong because wrong kind of voters contacted

Polls were wrong because the wrong kind of voters were contacted in the weeks and months leading up to the UK’s General Election in May 2015, a new inquiry has found. Newspapers across the world are now wondering whether pollsters in their own countries have also made the same mistake.

Virtually all the polls overestimated Labour support and underestimated Conservative support. Most of them predicted a Labour win, but the Conservatives won with an overall majority of 12 seats.

Some prominent figures, among them Mr. Cameron’s Australian election strategist Lynton Crosby, even called for polls to be banned in the final weeks preceding an election, arguing that they do not accurately represent national opinion at that point.

Most polling organisations got their estimates for the UK’s General Election wrong.

Mistake due to ‘sample error’

After seven months of gathering and analysing data on the several hundred polls published in 2015 before the UK election, a panel of academics and opinion experts, chaired by Patrick Sturgis, a professor of research methodology at Southampton University, concluded that the fault lay in ‘sample error’: the people the polling companies were talking to did not represent the general population.

The inquiry reported that pollsters particularly failed to get politically apathetic people to take part and answer questions.



Prof. Sturgis and colleagues say that statistical adjustment procedures applied by polling organisations were ineffective in mitigating these errors.

Prof. Sturgis said:

“There were too many Labour voters and too few Conservatives in their samples. It seems it’s likely to be related to having too many politically engaged people.”
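As a rough illustration of how this kind of sample error can arise, consider a minimal sketch with invented figures (not the Inquiry’s data), in which politically engaged voters lean more towards Labour and are also much more likely to take part in a poll:

```python
# Hypothetical illustration of 'sample error' with invented figures:
# engaged voters lean Labour AND are more likely to answer pollsters.

# Population: 30% politically engaged (40% vote Labour),
#             70% disengaged (28% vote Labour)
true_labour = 0.30 * 0.40 + 0.70 * 0.28                  # 31.6%

# Response rates: engaged voters are three times as likely to take part,
# so they make up a much larger share of the sample than of the population.
share_engaged_in_sample = (0.30 * 0.60) / (0.30 * 0.60 + 0.70 * 0.20)

polled_labour = (share_engaged_in_sample * 0.40
                 + (1 - share_engaged_in_sample) * 0.28)

print(f"True Labour share:   {true_labour:.1%}")
print(f"Polled Labour share: {polled_labour:.1%}")   # roughly 3 points too high
```

Because the over-representation in this sketch is driven by political engagement rather than by age, sex or region, weighting the sample back to census demographics would not remove the skew, which is consistent with the panel’s finding that the usual statistical adjustments were ineffective.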

The inquiry, commissioned by the Market Research Society and the British Polling Council, follows a series of polling failures across the world. Several political parties and other organisations have called into question the methods used by pollsters as they struggle to adapt to rapidly changing voter behaviour in the age of the Internet.

Many companies that carried out polls in the UK said they found it hard to get certain demographic groups, especially young voters, to respond.

The conclusion that samples not representing the general population were the main cause of the polling error is supported by comparisons of the polls with probability surveys carried out after the election (the British Social Attitudes survey and the British Election Study). These produced more accurate estimates of the Conservatives’ lead over Labour.

Prof. Sturgis’ research interests are in the areas of survey methodology, public opinion and political behaviour, statistical modelling, public understanding of science and technology, social capital, and social mobility. (Image: cardiff.ac.uk)

The Inquiry also concluded that the following factors were likely to have made – at most – a modest contribution to the total error: the treatment of overseas voters, voter turnout misreporting, question wording and ordering, the treatment of unregistered voters, and the treatment of postal voters.

Landline to mobile phone shift

A significant proportion of Britons are unwilling to answer landline telephone calls for fear of being hassled by sales calls and scammers. In fact, more than half of all young British voters have no landline. Many people who do have landlines only keep them because they form part of a high-speed WiFi and TV package, and use their mobiles for making and receiving calls.

The move from landlines to having only mobile phones is not just a British trend. The Pew Research Center, based in Washington, USA, announced earlier this month that three-quarters of its calls from now on will be made to mobile phones.

The British Polling Council (BPC) announced today that it welcomed the provisional findings of the Polling Inquiry on why pollsters overestimated Labour and underestimated Conservative support last year.

The BPC insists the Inquiry has been working wholly independently of its sponsors – the BPC and the Market Research Society.

Lord Foulkes of Cumnock believes pre-election polls should be banned. “The people who volunteer to be polled online or on the phone are part of a panel who inevitably become institutionalised. It is no longer a random sample of the population,” he said. (Image: Wikipedia)

The president of the BPC, Prof. John Curtice, said:

“Today’s unveiling will provide the polling companies and everyone else with an interest in its work the opportunity to hear and respond to the Inquiry’s initial conclusions. It hopes that this process will prove helpful to the work of the Inquiry and ensure that its final conclusions are based on the strongest possible body of evidence.”

The Market Research Society (MRS) also welcomed the preliminary findings, which were presented by Prof. Sturgis on behalf of the independently-selected panel.

The final recommendations are due to be reported in March 2016.

Chief Executive of MRS, Jane Frost CBE, said:

“We would like to offer our thanks to Prof. Patrick Sturgis and his panel for the speed of their response and commitment to producing robust and insightful conclusions. We look forward to receiving their final report in March and acting on its recommendations.”

The Inquiry Panel members are: Patrick Sturgis, University of Southampton (Chair); Stephen Fisher, University of Oxford; Ben Lauderdale, London School of Economics; Jane Green, University of Manchester; Patten Smith, Ipsos-MORI; Will Jennings, University of Southampton; Nick Baker, Quadrangle; Mario Callegaro, Google; and Jouni Kuha, London School of Economics.

What about late swing?

The evidence of late swing – voters changing their voting intentions between the final polls and Election Day – is ‘inconsistent’.

Several polls suggest there was a late swing in favour of the Conservatives, while others showed none.

Given that polling organisations use different methods, the Inquiry team was unable to reach a definitive conclusion on whether late swing was a contributory factor in the polling miss.

The Inquiry concluded that if late swing did occur, it would have accounted for a tiny amount of the total polling error.

Southampton University wrote:

“A surprising feature of the 2015 election was the lack of variability across the polls in estimates of the difference in the Labour and Conservative vote shares.”

“Having considered the available evidence, the Inquiry has been unable to rule out the possibility that ‘herding’ – whereby pollsters made design decisions that caused their polls to vary less than expected given their sample sizes – played a part in forming the statistical consensus.”

“It is important to note that the possibility that herding took place need not imply malpractice on the part of polling organisations.”
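To see why the similarity of the final polls is statistically striking, a back-of-the-envelope check is useful. Assuming, purely for illustration, simple random samples of around 1,000 respondents with both main parties near 34% (real quota and panel samples behave differently, so this is only a sketch), the chance variation to expect in the reported Conservative lead over Labour is roughly:

```python
import math

# Illustrative check, not the Inquiry's calculation: how much should the
# Conservative lead over Labour vary across independent polls by chance alone?
n = 1000                      # typical published sample size (assumed)
p_con, p_lab = 0.34, 0.34     # final polls put both parties near 34%

# Variance of the difference of two shares estimated from the same sample:
# Var(p_con - p_lab) = [p_con(1-p_con) + p_lab(1-p_lab) + 2*p_con*p_lab] / n
se_lead = math.sqrt((p_con * (1 - p_con)
                     + p_lab * (1 - p_lab)
                     + 2 * p_con * p_lab) / n)

print(f"Standard error of the lead: about ±{se_lead * 100:.1f} points")
# ~±2.6 points, so independent polls of this size should scatter noticeably.
```

On that assumption, published leads clustering within a point or so of one another are tighter than sampling noise alone would produce, which is the pattern the Inquiry describes as consistent with, though not proof of, herding.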

Professor Sturgis said:

“There have been many theories and speculations about what went wrong in 2015 but, having considered the available evidence, the inquiry panel has concluded that the ways in which polling samples are constructed was the primary cause of the polling miss.”