How useful were the polls?

The polls failed to spot rising pro-Remain sentiment in May

London4Europe Committee member Michael Romberg compares the story told by the opinion polls for London with the actual results. A parallel blog published today looks at the results and asks What if?. A separate blog compares the results in 2019 with those in 2014.


List of charts and tables in this blog

This article looks at the London regional section of national opinion polls and one London opinion poll (Chart 4 has national figures).  As far as I know we have recorded all published opinion polls about the European Parliament elections voting intentions that have regional tables. The actual result is in the London charts and tables - shown as a square at the far right of each chart.

  • Chart 1: vote tracker - all parties
  • Chart 1A: vote tracker - six largest parties - moving average of last ten poll results
  • Chart 2: vote tracker only for Remain parties 
  • Chart 2A: vote tracker only for Remain parties but excluding polls that did not ask about all three parties; with trendlines.
  • Chart 2B: vote tracker only for Remain parties: only polls with fieldwork in May; with trendlines.
  • Chart 2C: vote tracker only for Remain parties - moving average of last ten poll results
  • Chart 3: vote tracker showing total pro-Brexit parties vote and total pro-Remain parties vote, with trendlines.
  • Chart 3A: vote tracker showing total pro-Brexit parties vote and total pro-Remain parties vote: only polls with fieldwork in May; with trendlines.
  • Chart 3B: vote tracker showing total pro-Brexit parties vote and total pro-Remain parties vote - moving average of last ten poll results
  • Chart 4: vote tracker showing total pro-Brexit parties vote and total pro-Remain parties vote: national figures.
  • Table 1: vote share tracker
  • Table 2: seat share tracker. Seats implied by the opinion polls using the D'Hondt method of seat allocation

 

UK/ GB trackers

You can look at a UK/ GB level tracker graph on the National Centre for Social Research's WhatUKthinks website here. NatCen allows you to group parties, for example into Brexit and Remain. Wikipedia also has a table of results for UK/GB opinion polls and a tracker graph with a slightly different selection of polls. Both sites show the underlying national data in tables.

 

The general story that the polls had set out 

  • Labour began as the clear leader. However, by the beginning of May it had fallen back and The Brexit Party had risen. Over the course of April the Remain parties had risen, and after the local government elections the Liberal Democrats became the leading Remain party. So by the end of the polling period Labour, The Brexit Party and the Liberal Democrats had similar polling scores.
  • UKIP began with respectable polling scores. It was soon overtaken, and then eclipsed, by The Brexit Party, which rose to more than twice the scores that UKIP had initially shown.
  • The three Remain parties increased their total vote share over the course of April but did not grow markedly further in May.
  • In April there was no clear leader amongst the three Remain parties. After the local government elections the Liberal Democrats pulled ahead to become the clear leading Remain party. The Greens persistently showed a small lead over Change UK, but the difference was within the polls' margin of error.
  • The Brexit-backing parties always outpolled the Remain-backing parties. 
  • The final polls would imply scores for The Brexit Party, Labour and Liberal Democrats of around 20-25%, Conservatives of around 10-15%, Change UK and Greens of about 5-10%, UKIP and other parties under 5% in total. The total Brexit-supporting party vote would be about 60-65% and the total Remain-supporting vote about 35-40%. 

 

How close to the result?

In addition to the margin of error on polls, one has to remember that people react to polls. So if the polls and the outturn are different, it may be because the polls were right and people changed their behaviour in response.

Now, looking at the actuals - the big squares on the right of the charts - we can see that:

  • The broad story (Brexit party riding high, Conservatives very low, Labour not doing well, Liberal Democrats well up) was reflected in the polls
  • The polls got Labour's position about right.
  • They somewhat overstated The Brexit Party's vote and substantially overstated the Conservatives' vote.
  • The polls understated the Liberal Democrats' and the Greens' votes and correctly foresaw Change UK's position.
  • In general, the polls' story that the pro-Brexit and pro-Remain parties' total vote shares were broadly constant in May was wrong, and so was the implication that the increase in the forecast Liberal Democrat vote in May was coming largely at the expense of the other Remain parties. The pro-Remain camp clearly kept growing. The trend line for the Liberal Democrats over the whole polling period (i.e. including the growth in April) is almost bang on, though the Greens' is still understated. 
  • The final tally was Brexit-backing parties 52%: Remain-backing parties 48%. That still does not reflect London's pro-Remain stance, but is closer to it than the polls had suggested it would be.
  • The large YouGov poll 8-17 May was pretty close; but the equally large YouGov poll 7-10 May not so close. Different timing? Chance? Size not that important?

 

Comment on party Brexit stance

Not all electors will have interpreted the polls in terms of pro-/anti-Brexit. For some voters Brexit/Remain was not the main issue.

Others would have failed to realise that Labour was pro-Brexit, or hoped that it would change to become pro-Remain. The decision by the People's Vote campaign to treat Labour as pro-PV will have made it easier for some Remainers to vote Labour. 

 

General comments about polls

Polls are just snapshots and come with a significant margin of error, particularly when you are looking at a small or very small subset of a poll, as almost all of these London figures were (we give the figures for the London sub-set in Table 1). ComRes publish a margin of error calculator. The London electorate is 5.5m. The largest poll had a sample of 1,111, giving a margin of error of 3%: if the poll said that 15% would vote for party X, you could be 95% sure that the true result would be in the range 12-18%. The smallest sample was 83, where the margin of error is 11%, so the range for party X would be 4-26%. For a more typical London subset sample of 200, the margin of error would be 7%. So one should never read too much into small differences or unusual results. 
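The figures above follow from the standard formula for the 95% margin of error on a sampled proportion. This is a sketch using the textbook worst-case assumption (p = 0.5, simple random sampling); the polling companies' own calculators may adjust for weighting, so treat it as a rule of thumb rather than their exact method:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for an estimated proportion p
    from a simple random sample of size n. p = 0.5 is the worst case,
    which is the usual rule-of-thumb choice."""
    return z * math.sqrt(p * (1 - p) / n)

# The sample sizes discussed above:
for n in (1111, 200, 83):
    print(f"n = {n}: +/- {100 * margin_of_error(n):.0f} points")
```

Running this reproduces the 3%, 7% and 11% figures quoted above.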

Trends are a bit more reliable, but bring the problem that different companies have different methodologies (see more on that in a blog by YouGov's Anthony Wells). We provided graphs with trendlines and moving averages of the last ten polls. But that relied on the errors cancelling each other out. 
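The "moving average of the last ten polls" used in Charts 1A, 2C and 3B is simply a trailing mean. A minimal sketch (the poll series below is made up for illustration; early in the series, where fewer than ten polls exist, it averages whatever is available):

```python
def trailing_average(values, window=10):
    """Mean of the last `window` values at each point; at the start,
    where fewer than `window` values exist, average what is available."""
    out = []
    for i in range(len(values)):
        recent = values[max(0, i - window + 1): i + 1]
        out.append(sum(recent) / len(recent))
    return out

# Illustrative only - a noisy sequence of poll scores for one party:
polls = [22, 18, 25, 20, 24, 19, 23]
smoothed = trailing_average(polls, window=3)
```

Smoothing this way damps single-poll spikes, but, as noted above, it only helps if the individual polls' errors roughly cancel out.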

You will understand the problems that the wide margin of error on polls with small regional samples, and the complexity of D'Hondt, presented to those offering tactical voting advice at the regional level. London4Europe advised against voting tactically, recommending instead a strategic vote for the Remain party of your choice. Our parallel blog published today gives, as it were, tactical voting advice with the benefit of hindsight (and reaches the same conclusion on the value of tactical voting).

There is a fuller list of caveats after Table 1, as well as notes and general points on interpretation.

A recent blog by London School of Economics academics at the Electoral Psychology Observatory shows why reading opinion polls is actually a lot more complicated than it looks.

 

What London4Europe had published

We published individual calculations for each poll showing the implied number of seats in London, three Eurotrackers that brought the results together and included graphs with trendlines, and a graph that smoothed the spikes by using moving averages. All these are available in the latest blogs section of the website. 

 

CHART 1: Vote Tracker - all parties: opinion poll points; and actuals

 

 

CHART 1A: vote tracker - six largest parties - moving average of last ten poll results; and actuals

 

 

CHART 2: Vote Tracker for Remain parties: opinion poll points; and actuals

 

 

CHART 2A: Vote Tracker for Remain parties - only includes polls that asked about all three parties. Note that some of those polls were taken before the 23 April launch of Change UK's European elections campaign (29 March: applied to be registered as a party; 16 April: approved as a party). Trendlines calculated by Excel. The calculation ignores that the time interval between polls is not constant. Opinion poll points; and actuals

 

 

CHART 2B: Vote Tracker for Remain parties - only polls with fieldwork undertaken in May.  Trendlines calculated by Excel. The calculation ignores that the time interval between polls is not constant. Opinion poll points; and actuals

 

 

CHART 2C: vote tracker only for Remain parties - moving average of last ten poll results; and actuals

 

 

CHART 3: Vote Tracker showing the total pro-Brexit vote (Brexit Party, Conservative, Labour, UKIP) and the total Remain vote (Change UK, Green, Liberal Democrat). Other is excluded. The trendlines have been calculated by Excel. The calculation ignores that the interval between polls is not constant. Opinion poll points; and actuals.

 

 

CHART 3A: Vote Tracker showing the total pro-Brexit vote (Brexit Party, Conservative, Labour, UKIP) and the total Remain vote (Change UK, Green, Liberal Democrat). Other is excluded. Only polls with the fieldwork conducted in May. The trendlines have been calculated by Excel. The calculation ignores that the interval between polls is not constant. Opinion poll points; and actuals

 

CHART 3B: vote tracker showing total pro-Brexit parties vote and total pro-Remain parties vote - moving average of last ten poll results; and actuals

 

CHART 4: National polls - vote tracker - data taken from National Centre for Social Research - mix of UK & GB coverage - showing the total pro-Brexit vote (Brexit Party, Conservative, Labour, UKIP), the total Remain vote (Change UK, Green, Liberal Democrat, SNP & PC), and other. Opinion poll points

 

 

TABLE 1: Vote tracker. This table looks at the % share of the vote in successive opinion polls. It also gives the size of the London subset of the opinion poll (LSS). Actuals are in the final column.

 Table 1a

TABLE 1 - Vote Tracker - continued

Table 1b

TABLE 1 - Vote Tracker - continued

Table 1c

Table 1d

Notes

  • LSS = London Sample Size - the size of the sub-set of the poll that relates to London (occasionally the whole poll, when the whole poll is in London)
  • Brexit = Farage’s Brexit Party
  • Change UK = formerly The Independent Group of MPs
  • LD = Liberal Democrats
  • TOTAL BREXIT = sum of Brexit Party, Conservatives, Labour and UKIP
  • TOTAL REMAIN = sum of Change UK, Green and Liberal Democrats
  • "Other" is excluded from these two totals.
  • Totals do not always sum to 100 due to rounding.
  • Basis of allocation of Labour to Brexit: see, for example, the 2017 general election manifesto, Jeremy Corbyn’s speech to the 2018 party conference, and the 2019 European Parliament elections manifesto. The famous 2018 conference resolution only rejects a Conservative deal that fails to meet Labour’s six tests, and no-deal; it does not reject Brexit absolutely.
  • Note that individual MEP candidates for the Conservative and Labour parties do not necessarily subscribe to their party’s stance on Brexit. We have published information on the Brexit stance of individual candidates here. Allocation of votes to Total Brexit/ Remain is based on parties not individuals. The closed list system means that voters cannot choose individual candidates on a party's list.
  • Party MPs, members, supporters and voters do not necessarily share their party's position on Brexit. Moreover, in elections people vote on many issues, not just Brexit. So, especially with Conservative and Labour votes, the extent to which the elections should be taken to be a proxy referendum is limited. 
  • For both of the main parties there are organisations campaigning to change their party's policy on Brexit. You can read up about them here in our blog and find their addresses so that you can join them. 
  • You can find the individual data for each previous poll with calculations and a link to the source on the latest blogs page.
  • Where individual tables included don't know/ won't say/ won't vote as a separate category, they have been taken out in the tracker table.

Some general points

  • In the early polls, respondents may not have given much thought to these being European Parliament (as opposed to Westminster) elections.
  • At the time of the first polls the Brexit Party and Change UK had not been founded. Then they were rumoured, then announced, then launched.
  • Polls before Labour's 30 April National Executive Committee meeting are before the confirmation of Labour's policy for the European Parliament elections.
  • Polls before the 2 May local elections do not take into account the results of the elections or the reactions to those results.

 

CAVEATS

  • Polls are just a snapshot in time. People’s views change. If “don’t know/ won’t say” is large, then those respondents could swamp small differences between parties once they decide. Similarly, if many people say they will not vote, their votes could affect the result if they change their minds.
  • European elections normally have low turnout. Differential turnout amongst supporters of different parties could affect the result compared with a poll. Different polling companies have different methodologies for adjusting for turnout.
  • The definition of “London” in the poll may not match the London constituency for the election.
  • Polls come with a margin of error. On the highest-level figures, asking a question of the whole sample, a rule of thumb is a margin of +/- 3 percentage points (so a finding that 45% think this might be anything in the range 42%-48%). London figures are almost always a subset of the poll, so the margin of error is larger. A good sample size for a whole poll would be 2,000 or so; the London numbers are much smaller than that - we now include the London sample size in Table 1 (row LSS). Only the YouGov 7-10 May poll was a London-only poll, with all 1,000 respondents in London.
  • My calculations are on rounded numbers and that might introduce an error when results are close.
  • Different companies use different methodologies. So polls asked by different companies are not wholly comparable.
  • Small differences between polls do not tell you anything because of methodological differences and the margin of error which is quite large because the London sample is a small subset of the total sample.
  • Unusual results in a single poll do not tell you anything because the poll might be an outlier. Wait to see whether the effect is sustained.

 

TABLE 2: Seat tracker. This table looks at the seat allocation implied by successive opinion polls. Seats have been allocated using the D'Hondt formula. You can read up how that works here. You can look at the individual calculations with links to sources in a series of individual blogs with the title "Polls into seats" in the latest blogs section of the website. Actuals are in the final column.
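The D'Hondt allocation used in Table 2 can be sketched in a few lines: each of the region's seats goes in turn to the party with the highest quotient, votes divided by (seats already won + 1). The party names and vote shares below are hypothetical, for illustration only:

```python
def dhondt(votes, seats):
    """Allocate `seats` by the D'Hondt method: each successive seat goes
    to the party with the highest quotient votes / (seats won + 1).
    On an exact tie, max() picks the party listed first."""
    won = {party: 0 for party in votes}
    for _ in range(seats):
        winner = max(votes, key=lambda p: votes[p] / (won[p] + 1))
        won[winner] += 1
    return won

# Hypothetical vote shares (%), not an actual poll; 8 seats as in London:
shares = {"A": 34, "B": 25, "C": 21, "D": 15, "E": 5}
print(dhondt(shares, seats=8))  # {'A': 3, 'B': 2, 'C': 2, 'D': 1, 'E': 0}
```

Note how the method rewards larger parties: party E's 5% wins nothing, which is why small regional samples made seat forecasts (and tactical advice) so sensitive to polling error.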

Table 2a

TABLE 2 - seat tracker - continued

Table 2b

TABLE 2 - seat tracker - continued

Table 2c

Table 2d

 

This article was originally published on the London4Europe website here