Thanks for this! I'll be using the YouGov example of the effects of subtle framing in my research methods course next time I teach it.
Letters From An American charges $5 per month and is therefore the number one subscription. I think you'd make more if you lowered your price. :-)
Two questions:
1. Is there data to compare current D sentiment towards their own party to R sentiment towards the GOP in the Tea Party era (2010 cycle)?
2. How does Trump’s current net *strong* Approve - Disapprove rating compare to the presidential numbers in the last 10 years?
Happy Thanksgiving
Psycholinguist here. The two findings from the beginning of this post, ICE approval in YouGov and the generic ballot results from your Verasight poll, are both examples of framing effects--where you can present the same (or very similar) information in slightly different ways and get readers to come to very different conclusions. Both findings are interesting and in line with previous research, but I would caution against making strong inferences about political strategy based on them. Framing effects tend to be pretty transient, because they're about how people construct situation models to understand a text while they read it. People aren't really changing their minds; they're just evaluating the same question under different framings, where different things are salient.
Issue framing is a key part of politics and does drive voters.
I'm again struck by the knowledge that polls are inherently biased, not so much as a result of construction (sample selection bias, wording bias, etc.) but by the very ignorance of the survey respondents themselves. I consider myself to be very aware of current affairs, and I have a fairly deep understanding of how our government works at all levels - local, state, and federal - and of the issues facing our continued existence as a democracy. When reading the results of surveys, my personal bias shines through because I inherently assume that all respondents are as informed as I am. This is obviously not the case. How, then, can any survey be trusted? The problem is not in our stars, dear Brutus...
I'm a bit confused by the results of your poll, or, at least, the way you describe it. Based on the graph, there is only a 1-point difference between responses in the informed vs. uninformed groups. Although the sample size is large, I'm guessing this still falls within the margin of error, which would suggest to me that, although a large proportion of voters is lamentably poorly informed, they do know who controls Congress. What am I missing?
I'm a psycholinguist, and this is what's called a linguistic framing effect. Another example of a framing effect is that people would rather buy "75% lean" beef than "25% fat" beef, even though both are the same. What's going on here, as far as I can tell, is that reminding people who controls Congress makes their anti-incumbency bias more salient relative to other factors they might use to make the decision. It's not necessarily that they don't know, or that they're changing their minds. It just makes incumbency (a bad issue for the GOP right now) more salient.
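On the margin-of-error question above: here's a back-of-the-envelope check. The sample sizes and percentages below are made up, since the post doesn't give the exact split, but under these assumptions a 1-point gap between two half-samples is well inside the noise.

```python
import math

def moe_of_difference(p1, n1, p2, n2, z=1.96):
    """95% margin of error for the difference between two independent
    sample proportions (p1, p2 given as fractions)."""
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return z * se

# Hypothetical: 1,000 respondents per arm, with a Democratic share of
# 48% in the uninformed arm and 49% in the informed arm.
moe = moe_of_difference(0.48, 1000, 0.49, 1000)
print(f"MOE on the difference: +/- {moe * 100:.1f} points")  # ~ +/- 4.4
```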
"real, well-thought-out attitudes, or are simply answers quickly conjured" -- how quickly conjured was this statement? Would it change at all with a prompt reminding you that you know that almost everything you say is not "or" but a mixed of recurring predispositions and a response to immediate stimuli that slightly vary? We all create specifics from a mix of available tools. It is a probabilistic universe, not a black and white one.
I once ran a panel survey comparing responses to closed- and open-ended assessments of candidates and a vote. Disproportionately, voters chose to express their view by signaling: picking the single most problematic point, for them, on a known underlying continuum of contrasting candidate strengths, thereby signaling their agreement with each of the less problematic items. When their minds changed on the most problematic closed-ended questions, so did the open-ended responses. I could not have known what was going on without both closed and open questions and a panel study -- a rarity.
'New polling by The Argument shows Democrats gain on the generic ballot when people responding “don’t know” are forced to make a decision.'
Sounds like one of the arguments for ranked-choice voting.
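A toy illustration of that mechanism, with invented numbers rather than The Argument's actual results: the topline margin moves by however unevenly the forced "don't know" respondents break.

```python
# Invented shares, in points; not The Argument's actual data.
dem, gop, dk = 44.0, 42.0, 14.0

# Suppose the "don't know" respondents break 60/40 toward Democrats
# when forced to choose.
dem_forced = dem + 0.60 * dk
gop_forced = gop + 0.40 * dk

print(f"Before forcing: D+{dem - gop:.1f}")                # D+2.0
print(f"After forcing:  D+{dem_forced - gop_forced:.1f}")  # D+4.8
```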
Thanks, as always, for this additional information and analysis. Happy Thanksgiving and good luck with cooking!
I would frame the significance of David Shor's finding differently. It's not that "public opinion can change" in the passive voice. It's that Democrats taking a stand and fighting hard can change public opinion.
That underscores the fundamental problem with poll-driven politics, a.k.a. "popularism" - how can you know that shutting down the government over Obamacare subsidies will be popular until you take the huge political risk of doing it?
There is an obscure and highly technical word for this approach to politics. It's called "leadership." And I don't think the "science" of polling has a way to measure it.
On the ICE assaulting citizens question, we need crosstabs on how they answered the first part. It's one thing if they approve of ICE because they think it almost never happens; it's quite another thing if they approve because they think it happens a lot. In the latter case, they're pretty much supporting fascism.
On the first poll: The priming effect of earlier questions in a poll is so well known that it was illustrated in a scene from the 1980s British TV show "Yes Prime Minister". (Actually, Sir Humphrey uses one more trick, namely acquiescence bias.) Ipsos ran Sir Humphrey's two polls in real life in 2024, and just as predicted, they got a plurality in favour of whichever side the prior questions nudged respondents towards (45-38 in favour versus 34-48):
https://www.ipsos.com/en-uk/yes-prime-minister-questionnaire-design-matters
The ICE example involves a subtler difference in wording, but it's the same phenomenon as far as I can see. Or is the point here to get some measure of the magnitude of voters' sensitivity to different sorts of priming, as a proxy for the "softness" of opinion?
I think it's the latter!
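For what it's worth, a rough significance check on those two Ipsos toplines (45% vs. 34% in favour). The 1,000-per-arm sample size is an assumption on my part; the linked write-up has the real figures.

```python
import math

def two_prop_z(p1, n1, p2, n2):
    """z statistic for the difference between two independent
    proportions, using the pooled standard error."""
    p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Assumed n = 1,000 per framing (hypothetical).
z = two_prop_z(0.45, 1000, 0.34, 1000)
print(f"z = {z:.1f}")  # ~5.0, far beyond the ~1.96 cutoff for p < .05
```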
Thank you for the thoughtful presentation of the numbers. I’ve been wondering: are there polls about the Supreme Court that would be interesting to review?
I’ve also thought for some time that the best way to handle the issue of trans women competing in women’s sports is to leave it to the athletic governing bodies, on the grounds that the advantage that trans women have over “birth-women” varies from one sport to another. So I’d like to see a question on whether decisions about trans women competing in women’s sports should be made by (a) state governments, (b) the federal government, or (c) governing bodies for each individual sport.
Or by local school boards, which is the position of AOC.
I've heard, but have not seen the data, that in school board elections in 2025, the MAGA candidates lost to people who actually care about education and children rather than rabid ideology.
Fact check: True!
I’ve thought for a long time that Democratic politicians place too much emphasis on financial aid for college educations and not enough on financial aid for apprenticeships and other “blue-collar” job training that prepares young people for career paths that provide good jobs but don’t require a college education. I’d like to see a poll question that asks people how much they support increased government funding for (a) college education and (b) non-college job training.