What to make of the Muslim Census data?
Welcome to the 79th edition of The Week in Polls, which dives into some dramatic figures published by Muslim Census.
Then it’s a look at the latest voting intention polls followed by, for paid-for subscribers, 10 insights from the last week’s polling and analysis. (If you’re a free subscriber, sign up for a free trial here to see what you’re missing.)
But first, two follow-ups to last time. First, when writing about pollsters buying in panel data, I didn’t mention a major positive reason for doing so: it allows smaller pollsters to access large and fresh enough samples for a good poll. Otherwise, you’d have to build up a huge panel from scratch in order to be able to poll enough people and to avoid the trap of ending up polling the same people time and time again. That would greatly reduce the number of pollsters that could operate and would deter new pollsters from starting up. Buying in data can come with risks, and is not covered by the industry’s transparency rules, but there are also very good reasons for it happening.
Second, I should have written that people would support a target to remove fossil fuels from UK electricity generation by 2030 by 50%-30%. I wrote 2023 instead, which I suspect would have produced a rather different result… Apologies for that and thank you to Andrew Teal for spotting the error.
As ever, if you have any feedback or questions prompted by what follows, or spotted some other recent polling you’d like to see covered, just hit reply. I personally read every response.
Been forwarded this email by someone else? Sign up to get your own copy here.
Want to know more about political polling? Get my book Polling UnPacked: the history, uses and abuses of political opinion polling.
Or for more news about the Lib Dems specifically, get the free monthly Liberal Democrat Newswire.
The Muslim Census figures are dramatic, so are they correct?
At first glance, the figures published by Muslim Census this week seemed clear, and dramatic:
Labour and Conservatives at risk of losing majority of their Muslim vote following reactions to events in Israel and Palestine
The findings from our survey of over 30,000 British Muslims are live.
Labour move from 71% of the Muslim vote, to just 5%.
Though even in that there were two warning signs about how good this data is. One is the unusually large sample size: 30,000. That is unusually large even for a normal survey of the whole national population, and a sample size that would be a real challenge for any traditional pollster to get close to matching from British Muslims alone.
The other is the size of that Labour change from 71% down to 5%. That would, if real, be completely off the scale for even the sort of voter support movements triggered by dominating national crises that are remembered for decades and become chapters in history textbooks. The Liz Truss premiership1, Britain crashing out of the ERM or the Suez debacle all had big impacts on political fortunes, but nothing close to this size of movement. Nor did the invasion of Iraq trigger movements of this scale.
The explanation? This wasn’t a poll. But rather a self-selecting survey of people: “the survey was distributed across several channels including mailing lists and social media”. Within 30 minutes of publishing the results, Muslim Census added a clarification that:
The survey above is not a poll, rather a voting sentiment survey of over 30,000 British Muslims.2
For example, some of the respondents had come via a tweet soliciting responses:
This sort of open-access polling, even if called a survey, does not get representative samples, as it is biased by who sees the invites and who is motivated to respond. It’s why the Daily Express had such a long record of publishing “polls” showing public opinion wildly different from what the public actually thinks: they were self-selecting surveys of Express readers.3
So not surprisingly, the Muslim Census figures came in for a fair degree of criticism, such as from Sunder Katwala, the director of British Future:
Likewise from one of The Week in Polls’s favourite professors, Rob Ford:
Sunder’s final sentence points, I think, to the best way to view these findings: treat them like a petition rather than a poll. That is, not as a neutral enumeration of people’s views, but still a sign of strength of feeling and interest in an issue.
Petitions are not polls. But it does tell us something if a petition gets 1 million signatures rather than 412 signatures. It doesn’t tell us that the 1 million view is the most popular view or even a very widely held view, but it does tell us something.
Indeed, when writing Polling UnPacked, I fessed up to how I’ve used such open-access surveys in the past - and how they can sometimes tell us something useful. I wrote about the online surveys I’d been involved in running during Lib Dem leadership contests:
Their record (originally with the Lib Dem Voice team) in polling Liberal Democrat members on internal party elections is pretty good. I had a mini-Gallup moment with my 2015 party leadership survey that put the contest much closer than many others were expecting, and turned out to be right. I had it as Tim Farron at 58 per cent to Norman Lamb’s 42 per cent. The actual result was 56.5 per cent to 43.5 per cent, respectively.
There were good reasons to think that, although not ‘proper’ polls, such surveys had some value, as they often matched up well against either actual results or proper polling. I put quite a lot of effort into trying to expand the reach of the surveys so that they would obtain a good cross-section of party views.
The key question was whether a self-selecting sample of online people doing a survey would be representative of the wider party membership. Sometimes that will be true, and sometimes not. What appeared to be the case at the time in all the elections I covered was that there was not a particularly strong activist-versus-non-activist split in support for different candidates, minimizing the risk that my sample (likely skewed towards party activists) would be misleading. Moreover, questions in the survey such as gender and the length of people’s party membership consistently showed that such samples were skewed – but crucially the data also showed that reweighting it made no significant difference to the overall figures.
In other words, the samples may not have been typical of the total membership, but the ways in which they were skewed did not seem to have an impact upon the balance of voting preferences…
I may also have been lucky to quit doing the surveys when I did, as I suspect the 2020 party leadership election would have seen the methodology fail.
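The reweighting check described in that passage can be sketched in a few lines. This is an illustrative example with made-up numbers, not the actual survey data: it compares the raw candidate split with the split you get after weighting respondents so that a skewed sample (here, hypothetically skewed towards long-standing members) matches known membership proportions.

```python
# Illustrative sketch of a reweighting check on a skewed survey sample.
# All numbers are invented; the point is the method, not the data.

# Suppose 70% of respondents are long-standing members,
# but the real membership is only 40% long-standing.
sample = [
    # (group, candidate, respondent count)
    ("long_standing", "A", 420),
    ("long_standing", "B", 280),
    ("newer", "A", 170),
    ("newer", "B", 130),
]
population_share = {"long_standing": 0.40, "newer": 0.60}

# Raw (unweighted) support for candidate A
total = sum(n for _, _, n in sample)
raw_a = sum(n for _, cand, n in sample if cand == "A") / total

# Weight each group so its share of the sample matches its known
# share of the membership
group_totals = {}
for group, _, n in sample:
    group_totals[group] = group_totals.get(group, 0) + n
weights = {g: population_share[g] * total / group_totals[g]
           for g in group_totals}

weighted_a = sum(n * weights[g]
                 for g, cand, n in sample if cand == "A") / total

print(f"raw A share:      {raw_a:.1%}")       # 59.0%
print(f"weighted A share: {weighted_a:.1%}")  # 58.0%
```

With these invented numbers the raw and weighted figures barely differ (59% vs 58%), which is the kind of result being described above: the sample was skewed, but the skew was largely uncorrelated with candidate preference, so reweighting made little difference.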
Given the improbability, discussed above, of the Muslim Census results being an accurate representation of overall British Muslim views, they don’t seem to have been as lucky with their figures.
The figures do - when thought of more like a petition than a poll - add a little to the overall picture of Muslim opinion. But I think it was a mistake that the way they were presented made them sound, at first, like something they are not: figures from a full scientific poll.
Above all though, perhaps, they highlight the absence of regular, high quality opinion polling of different minority communities in Britain. It’s a big gap in the picture that polling gives us of modern Britain.
Know other people interested in political polling?
National voting intention polls
Yes, once again it was a week without a poll putting the Conservatives on more than 30%, extending the run stretching back to late June (when a Savanta poll gave them 31%).4
Here are the latest figures from each currently active pollster:
Last week’s edition
High levels of interest in news from Israel and Gaza, and other polling news
The following 10 findings from the most recent polls and analysis are for paying subscribers only, but you can sign up for a free trial to read them straight away.
Keep reading with a 7-day free trial
Subscribe to The Week in Polls to keep reading this post and get 7 days of free access to the full post archives.