Mellman: How pollsters’ backroom decisions affect the polling results you see
I never intended to be a poll critic. In fact, I’m more a poll celebrant.
Nate Cohn of the New York Times is one of the more celebrated members of our profession, and properly so: careful, thoughtful, creative and transparent, he even answered my nebbishy questions about his latest polls on a Monday evening.
Cohn’s general conclusion from the Times/Siena College poll of six swing states comports with my argument here a couple of weeks ago: The presidential race is poised on a knife’s edge.
But I want to use two specific findings to highlight the importance of the backroom decisions pollsters make.
First is the difference sampling and weighting can make.
The Times presents its results for two audiences: all registered voters and the likely electorate. Pollsters argue endlessly about how to define likely voters, whether to sample the likely electorate or likely voters, whether and when to survey all registered voters, how to weight various segments, and the like, but often it makes little difference.
Indeed, in four of the six state polls, the margin between President Biden and Donald Trump differs by 1 point or less between registered voters and the likely electorate. In a fifth state, Wisconsin, the difference is a slightly larger 3 points, with Biden ahead by 2 points among all registered voters but behind by 1 point among the likely electorate.
And then there’s Michigan.
In the Great Lakes State, President Biden is behind by 7 points among registered voters, but ahead by 1 point among likely voters, yielding a substantial 8-point difference.
Some outlets report the results among registered voters, others focus on the likely electorate. Both are accurate reflections of the poll, but different sampling decisions can end up telling rather different stories.
There’s nothing nefarious here, and while I don’t quite believe the Michigan numbers (and neither does Cohn), they demonstrate the substantial impact that backroom “technical” decisions can have on polls.
Another set of backroom decisions involves coding responses to open-ended questions.
The Times/Siena polls asked, “What one issue is most important in deciding your vote this November?” Voters responded in their own words, dutifully recorded, verbatim, by the interviewers.
To make those words meaningful, they must be categorized according to some rules. That’s how answers like “I just can’t make ends meet every month,” “food costs too much” and “inflation” all presumably get subsumed and counted under the heading “inflation and the cost of living.”
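To make that categorization step concrete, here is a minimal sketch of how verbatim answers might be coded by keyword. The category names, keyword lists and their ordering are my own illustrative assumptions, not the Times/Siena coding rules.

```python
# Illustrative only: a crude keyword-based coder for open-ended responses.
# The categories, keywords and their ordering are hypothetical assumptions,
# not the actual Times/Siena coding scheme.

CATEGORIES = {
    "inflation and the cost of living": [
        "inflation", "prices", "cost of living", "make ends meet", "food costs",
    ],
    "foreign policy: Middle East": [
        "gaza", "israel", "palestin", "middle east",
    ],
    "foreign policy: other or unspecified": [
        "foreign policy", "the war", "ukraine",
    ],
}

def code_response(verbatim: str) -> str:
    """Assign a verbatim answer to the first category whose keywords appear in it."""
    text = verbatim.lower()
    for category, keywords in CATEGORIES.items():
        if any(keyword in text for keyword in keywords):
            return category
    return "other / uncodable"

if __name__ == "__main__":
    for answer in [
        "I just can't make ends meet every month",
        "food costs too much",
        "inflation",
        "stopping the war",
    ]:
        print(f"{answer!r} -> {code_response(answer)}")
```

Even in a toy version like this, small rule choices, such as which category is checked first or whether an unspecified “war” counts as a Gaza reference, move responses between buckets; those are exactly the kinds of backroom decisions at issue here.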
After analyzing these data, Cohn places some blame for the president’s political predicament on his support for Israel in its war against Hamas. “Around 13 percent of the voters who say they voted for Mr. Biden last time, but do not plan to do so again, said that his foreign policy or the war in Gaza was the most important issue to their vote.”
Less careful headline writers turned the lede into “13 percent of Voters Who Switched Support From Biden Cite His Gaza Policy.”
Cohn’s description is accurate, but a bit slippery, while the headline is inaccurate.
To reach his conclusion and generate the 13 percent, Cohn combined two different categories of open-ended responses. By far the smaller group, just 2 percent, specifically mentioned the Middle East, Israel, Palestine or Gaza.
The larger group, the remaining 11 percent or so, offered some aspect of foreign policy, generic or specific, as their top issue but did not mention Gaza or any related term.
Cohn tells me some of those people made ambiguous references to “stopping the war” (or the like) but didn’t say which war (Gaza, Ukraine, Myanmar, Sudan, the Maghreb, Somalia or Syria, to name a few).
But most of those citing foreign policy didn’t mention a war. We have no idea what aspect of the Biden foreign policy was central to them.
In short, Cohn added a very small group that cited Middle East issues to a larger group that did not reference Israel, Palestinians or Gaza in any way, treating them as if they were saying the same thing.
Moreover, an unknown number of those for whom the Hamas conflict was central thought the president was giving Israel insufficient backing.
Cohn would no doubt respond that these folks sympathized with Palestinians more than Israelis, but as I’ve argued here before, that’s a fundamentally flawed question. For many, “with whom do you sympathize” means “who do you feel sorrier for,” which is quite different from whom you support.
The key point here, though, is that the decision to mix two seemingly quite different categories of responses significantly shaped the analysis and interpretation of the data.
You rarely see exactly what happens in pollsters’ backrooms (or more accurately on their laptops), but it can significantly affect the results you do see.
Mellman is a pollster and president of The Mellman Group, a political consultancy. He is also president of Democratic Majority for Israel.