There is no doubt that 2016 was full of surprises, some of which were unwelcome. But the shock at the UK’s vote to leave the European Union and Donald Trump winning the US presidential election led to blame being placed firmly at the door of opinion polls. Didn’t they, after all, tell us the opposite would happen?
Fingers of blame were pointed mainly at the polling companies, the clear inference being that if only they had done their job properly then the outcomes could have been avoided, or at least prepared for. It is far too easy, and wrong, to blame the organisations concerned. They were a convenient scapegoat for politicians and some media outlets trying to explain their own failure to comprehend, let alone explain, Trump and Brexit.
There are a number of reasons why the polling seemed to get it wrong, but much of the criticism is overstated and overblown. It also fails to appreciate the role that the media played.
Most polls in the week before the vote had the Leave campaign ahead, but only just, and the final result was close. What confused many was that a swing back to the ‘status quo’ was expected. That obviously didn’t happen. In other words, a lead for Leave needed to be quite large to survive such a swing; as the swing never materialised, the small lead was enough.
In the case of the US, the polls were actually more accurate than they had been in the previous election in 2012. What didn’t generally get taken into account was the role of the Electoral College and how votes in individual states would be translated into Electoral College votes. The polls were often national polls, and they put Clinton in the lead. Nationally, that was the result: Clinton won the popular vote by around three million. But she didn’t win in the right individual states, didn’t secure the necessary Electoral College votes and so lost the election. So we are really considering the difference between state-level polls and national polls. Many of the states are also small in terms of population, which makes getting a representative sample more challenging.
Press and political (mis)representations
Although polling data is invariably made available, the results are normally delivered through the prism of journalistic and media interpretation. The results are sometimes picked to suit the angle of the story. Sometimes they are misrepresented, over-interpreted or just plain wrong. The other issue for polls is the media’s tendency to look for causal relationships that are simply not borne out by the data, and then to use the data to make a prediction that cannot be supported.
As Adam Drummond, senior research manager at Opinium recently explained to me:
“For all their recent issues, opinion polls are still the least worst option when it comes to finding out how somebody is going to vote. What needs to be better communicated however is the level of uncertainty around polls. We live in a very data driven world and the human brain is not very good at interpreting probabilities so we’re surprised when something that is supposed to be 70% likely doesn’t happen. While polls try to make public opinion easier to understand, they should be one of a number of information sources that people should use when making their predictions.”
There is also cherry-picking, often without full analysis, by blogs and across social media. One key result may be shared, but the context is lost in 140 characters.
Polls can be expensive to run and some organisations simply do not want to pay the costs. Not all polling companies are as reputable as others, and some may be prepared to cut their cloth according to the budget on offer.
But even the best polling companies cannot control how their data is used. When the interpretation is really poor, though, some have issued their own clarifications.
The National Centre for Research Methods held an inquiry into the polls for the 2015 British General Election. Although it found few significant flaws in approach, its recommendations led to changes in polling methodology, especially guarding against over-sampling the ‘overly’ politically active: in essence, those more likely to vote.
It turned out, though, that the ‘rules’ around referendums and turnout were different: people wanted to vote in the referendum in very large numbers.
But let’s not ignore politicians. They often abuse polling the most. Election leaflets are full of inaccurate bar charts that wouldn’t be allowed in schools, and polling ‘results’ that fail to provide any idea of sample size, location or details of those questioned. The standards they apply to others do not seem to apply to them.
I will make a prediction. The use of opinion polls will not decline. They are still useful in all sorts of campaigns, and the media still need an angle: something to grab attention or set an article apart from others.
Polls provide this, so they will not be going anywhere soon.